[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15494 1726853330.67253: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15494 1726853330.68267: Added group all to inventory
15494 1726853330.68269: Added group ungrouped to inventory
15494 1726853330.68276: Group all now contains ungrouped
15494 1726853330.68280: Examining possible inventory source: /tmp/network-iHm/inventory.yml
15494 1726853331.00078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15494 1726853331.00142: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15494 1726853331.00167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15494 1726853331.00232: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15494 1726853331.00358: Loaded config def from plugin (inventory/script)
15494 1726853331.00360: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15494 1726853331.00532: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15494 1726853331.00683: Loaded config def from plugin (inventory/yaml)
15494 1726853331.00685: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
15494 1726853331.00873: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
15494 1726853331.01804: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
15494 1726853331.01807: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
15494 1726853331.01810: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
15494 1726853331.01816: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
15494 1726853331.01821: Loading data from /tmp/network-iHm/inventory.yml
15494 1726853331.01890: /tmp/network-iHm/inventory.yml was not parsable by auto
15494 1726853331.01944: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
15494 1726853331.02286: Loading data from /tmp/network-iHm/inventory.yml
15494 1726853331.02369: group all already in inventory
15494 1726853331.02377: set inventory_file for managed_node1
15494 1726853331.02382: set inventory_dir for managed_node1
15494 1726853331.02383: Added host managed_node1 to inventory
15494 1726853331.02385: Added host managed_node1 to group all
15494 1726853331.02386: set ansible_host for managed_node1
15494 1726853331.02387: set ansible_ssh_extra_args for managed_node1
15494 1726853331.02390: set inventory_file for managed_node2
15494 1726853331.02392: set inventory_dir for managed_node2
15494 1726853331.02393: Added host managed_node2 to inventory
15494 1726853331.02394: Added host managed_node2 to group all
15494 1726853331.02395: set ansible_host for managed_node2
15494 1726853331.02396: set ansible_ssh_extra_args for managed_node2
15494 1726853331.02399: set inventory_file for managed_node3
15494 1726853331.02401: set inventory_dir for managed_node3
15494 1726853331.02402: Added host managed_node3 to inventory
15494 1726853331.02403: Added host managed_node3 to group all
15494 1726853331.02404: set ansible_host for managed_node3
15494 1726853331.02405: set ansible_ssh_extra_args for managed_node3
15494 1726853331.02408: Reconcile groups and hosts in inventory.
15494 1726853331.02411: Group ungrouped now contains managed_node1
15494 1726853331.02413: Group ungrouped now contains managed_node2
15494 1726853331.02415: Group ungrouped now contains managed_node3
15494 1726853331.02700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
15494 1726853331.02830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
15494 1726853331.03085: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
15494 1726853331.03118: Loaded config def from plugin (vars/host_group_vars)
15494 1726853331.03120: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
15494 1726853331.03127: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
15494 1726853331.03136: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
15494 1726853331.03186: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
15494 1726853331.03943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853331.04044: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
15494 1726853331.04292: Loaded config def from plugin (connection/local)
15494 1726853331.04295: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
15494 1726853331.05533: Loaded config def from plugin (connection/paramiko_ssh)
15494 1726853331.05537: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
15494 1726853331.07231: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15494 1726853331.07479: Loaded config def from plugin (connection/psrp)
15494 1726853331.07482: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
15494 1726853331.08485: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15494 1726853331.08526: Loaded config def from plugin (connection/ssh)
15494 1726853331.08529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
15494 1726853331.11998: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
15494 1726853331.12039: Loaded config def from plugin (connection/winrm)
15494 1726853331.12042: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
15494 1726853331.12080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
15494 1726853331.12148: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
15494 1726853331.12420: Loaded config def from plugin (shell/cmd)
15494 1726853331.12422: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
15494 1726853331.12451: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
15494 1726853331.12521: Loaded config def from plugin (shell/powershell)
15494 1726853331.12523: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
15494 1726853331.12783: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
15494 1726853331.12965: Loaded config def from plugin (shell/sh)
15494 1726853331.12968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
15494 1726853331.13205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
15494 1726853331.13327: Loaded config def from plugin (become/runas)
15494 1726853331.13330: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
15494 1726853331.13723: Loaded config def from plugin (become/su)
15494 1726853331.13725: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
15494 1726853331.14089: Loaded config def from plugin (become/sudo)
15494 1726853331.14091: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
15494 1726853331.14125: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15494 1726853331.14849: in VariableManager get_vars()
15494 1726853331.14868: done with get_vars()
15494 1726853331.15203: trying /usr/local/lib/python3.12/site-packages/ansible/modules
15494 1726853331.19294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
15494 1726853331.19409: in VariableManager get_vars()
15494 1726853331.19415: done with get_vars()
15494 1726853331.19417: variable 'playbook_dir' from source: magic vars
15494 1726853331.19418: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.19419: variable 'ansible_config_file' from source: magic vars
15494 1726853331.19420: variable 'groups' from source: magic vars
15494 1726853331.19421: variable 'omit' from source: magic vars
15494 1726853331.19421: variable 'ansible_version' from source: magic vars
15494 1726853331.19423: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.19424: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.19424: variable 'ansible_forks' from source: magic vars
15494 1726853331.19425: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.19426: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.19426: variable 'ansible_limit' from source: magic vars
15494 1726853331.19427: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.19428: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.19465: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml
15494 1726853331.19852: in VariableManager get_vars()
15494 1726853331.19983: done with get_vars()
15494 1726853331.20019: in VariableManager get_vars()
15494 1726853331.20031: done with get_vars()
15494 1726853331.20062: in VariableManager get_vars()
15494 1726853331.20075: done with get_vars()
15494 1726853331.20143: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15494 1726853331.20462: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15494 1726853331.20795: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15494 1726853331.22155: in VariableManager get_vars()
15494 1726853331.22381: done with get_vars()
15494 1726853331.23217: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
15494 1726853331.23344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15494 1726853331.26039: in VariableManager get_vars()
15494 1726853331.26043: done with get_vars()
15494 1726853331.26047: variable 'playbook_dir' from source: magic vars
15494 1726853331.26048: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.26049: variable 'ansible_config_file' from source: magic vars
15494 1726853331.26050: variable 'groups' from source: magic vars
15494 1726853331.26051: variable 'omit' from source: magic vars
15494 1726853331.26051: variable 'ansible_version' from source: magic vars
15494 1726853331.26052: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.26053: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.26054: variable 'ansible_forks' from source: magic vars
15494 1726853331.26054: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.26055: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.26056: variable 'ansible_limit' from source: magic vars
15494 1726853331.26057: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.26057: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.26202: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15494 1726853331.26417: in VariableManager get_vars()
15494 1726853331.26431: done with get_vars()
15494 1726853331.26467: in VariableManager get_vars()
15494 1726853331.26470: done with get_vars()
15494 1726853331.26474: variable 'playbook_dir' from source: magic vars
15494 1726853331.26475: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.26475: variable 'ansible_config_file' from source: magic vars
15494 1726853331.26476: variable 'groups' from source: magic vars
15494 1726853331.26477: variable 'omit' from source: magic vars
15494 1726853331.26478: variable 'ansible_version' from source: magic vars
15494 1726853331.26478: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.26479: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.26480: variable 'ansible_forks' from source: magic vars
15494 1726853331.26480: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.26481: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.26482: variable 'ansible_limit' from source: magic vars
15494 1726853331.26482: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.26483: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.26513: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15494 1726853331.26778: in VariableManager get_vars()
15494 1726853331.26792: done with get_vars()
15494 1726853331.26840: in VariableManager get_vars()
15494 1726853331.26844: done with get_vars()
15494 1726853331.26848: variable 'playbook_dir' from source: magic vars
15494 1726853331.26849: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.26850: variable 'ansible_config_file' from source: magic vars
15494 1726853331.26851: variable 'groups' from source: magic vars
15494 1726853331.26852: variable 'omit' from source: magic vars
15494 1726853331.26853: variable 'ansible_version' from source: magic vars
15494 1726853331.26853: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.26854: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.26855: variable 'ansible_forks' from source: magic vars
15494 1726853331.26860: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.26861: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.26862: variable 'ansible_limit' from source: magic vars
15494 1726853331.26863: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.26863: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.27099: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml
15494 1726853331.27168: in VariableManager get_vars()
15494 1726853331.27217: done with get_vars()
15494 1726853331.27219: variable 'playbook_dir' from source: magic vars
15494 1726853331.27220: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.27221: variable 'ansible_config_file' from source: magic vars
15494 1726853331.27222: variable 'groups' from source: magic vars
15494 1726853331.27223: variable 'omit' from source: magic vars
15494 1726853331.27223: variable 'ansible_version' from source: magic vars
15494 1726853331.27224: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.27225: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.27225: variable 'ansible_forks' from source: magic vars
15494 1726853331.27226: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.27227: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.27227: variable 'ansible_limit' from source: magic vars
15494 1726853331.27228: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.27229: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.27260: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml
15494 1726853331.27326: in VariableManager get_vars()
15494 1726853331.27338: done with get_vars()
15494 1726853331.27680: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15494 1726853331.27797: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15494 1726853331.28070: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15494 1726853331.28428: in VariableManager get_vars()
15494 1726853331.28449: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15494 1726853331.29973: in VariableManager get_vars()
15494 1726853331.29987: done with get_vars()
15494 1726853331.30021: in VariableManager get_vars()
15494 1726853331.30023: done with get_vars()
15494 1726853331.30025: variable 'playbook_dir' from source: magic vars
15494 1726853331.30026: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.30027: variable 'ansible_config_file' from source: magic vars
15494 1726853331.30028: variable 'groups' from source: magic vars
15494 1726853331.30028: variable 'omit' from source: magic vars
15494 1726853331.30029: variable 'ansible_version' from source: magic vars
15494 1726853331.30030: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.30030: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.30031: variable 'ansible_forks' from source: magic vars
15494 1726853331.30032: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.30032: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.30033: variable 'ansible_limit' from source: magic vars
15494 1726853331.30034: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.30034: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.30066: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml
15494 1726853331.30137: in VariableManager get_vars()
15494 1726853331.30151: done with get_vars()
15494 1726853331.30191: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
15494 1726853331.30399: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
15494 1726853331.30477: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
15494 1726853331.32313: in VariableManager get_vars()
15494 1726853331.32330: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15494 1726853331.33841: in VariableManager get_vars()
15494 1726853331.33845: done with get_vars()
15494 1726853331.33849: variable 'playbook_dir' from source: magic vars
15494 1726853331.33850: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.33851: variable 'ansible_config_file' from source: magic vars
15494 1726853331.33852: variable 'groups' from source: magic vars
15494 1726853331.33852: variable 'omit' from source: magic vars
15494 1726853331.33853: variable 'ansible_version' from source: magic vars
15494 1726853331.33854: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.33855: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.33855: variable 'ansible_forks' from source: magic vars
15494 1726853331.33856: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.33857: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.33857: variable 'ansible_limit' from source: magic vars
15494 1726853331.33858: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.33859: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.33890: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15494 1726853331.33955: in VariableManager get_vars()
15494 1726853331.33979: done with get_vars()
15494 1726853331.34010: in VariableManager get_vars()
15494 1726853331.34014: done with get_vars()
15494 1726853331.34016: variable 'playbook_dir' from source: magic vars
15494 1726853331.34017: variable 'ansible_playbook_python' from source: magic vars
15494 1726853331.34017: variable 'ansible_config_file' from source: magic vars
15494 1726853331.34018: variable 'groups' from source: magic vars
15494 1726853331.34019: variable 'omit' from source: magic vars
15494 1726853331.34020: variable 'ansible_version' from source: magic vars
15494 1726853331.34020: variable 'ansible_check_mode' from source: magic vars
15494 1726853331.34021: variable 'ansible_diff_mode' from source: magic vars
15494 1726853331.34022: variable 'ansible_forks' from source: magic vars
15494 1726853331.34022: variable 'ansible_inventory_sources' from source: magic vars
15494 1726853331.34023: variable 'ansible_skip_tags' from source: magic vars
15494 1726853331.34024: variable 'ansible_limit' from source: magic vars
15494 1726853331.34024: variable 'ansible_run_tags' from source: magic vars
15494 1726853331.34025: variable 'ansible_verbosity' from source: magic vars
15494 1726853331.34055: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml
15494 1726853331.34117: in VariableManager get_vars()
15494 1726853331.34129: done with get_vars()
15494 1726853331.34186: in VariableManager get_vars()
15494 1726853331.34197: done with get_vars()
15494 1726853331.34282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
15494 1726853331.34295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
15494 1726853331.34507: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
15494 1726853331.34661: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
15494 1726853331.34667: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
15494 1726853331.34902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
15494 1726853331.34932: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
15494 1726853331.35344: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
15494 1726853331.35409: Loaded config def from plugin (callback/default)
15494 1726853331.35412: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15494 1726853331.37956: Loaded config def from plugin (callback/junit)
15494 1726853331.37959: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15494 1726853331.38004: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
15494 1726853331.38069: Loaded config def from plugin (callback/minimal)
15494 1726853331.38274: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15494 1726853331.38314: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
15494 1726853331.38376: Loaded config def from plugin (callback/tree)
15494 1726853331.38379: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15494 1726853331.38702: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15494 1726853331.38705: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15494 1726853331.38731: in VariableManager get_vars()
15494 1726853331.38749: done with get_vars()
15494 1726853331.38757: in VariableManager get_vars()
15494 1726853331.38766: done with get_vars()
15494 1726853331.38773: variable 'omit' from source: magic vars
15494 1726853331.38811: in VariableManager get_vars()
15494 1726853331.38826: done with get_vars()
15494 1726853331.38850: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
15494 1726853331.40023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15494 1726853331.40302: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15494 1726853331.40391: getting the remaining hosts for this loop
15494 1726853331.40392: done getting the remaining hosts for this loop
15494 1726853331.40395: getting the next task for host managed_node1
15494 1726853331.40399: done getting next task for host managed_node1
15494 1726853331.40400:  ^ task is: TASK: Gathering Facts
15494 1726853331.40402:  ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853331.40404: getting variables
15494 1726853331.40405: in VariableManager get_vars()
15494 1726853331.40415: Calling all_inventory to load vars for managed_node1
15494 1726853331.40417: Calling groups_inventory to load vars for managed_node1
15494 1726853331.40420: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853331.40432: Calling all_plugins_play to load vars for managed_node1
15494 1726853331.40442: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853331.40449: Calling groups_plugins_play to load vars for managed_node1
15494 1726853331.40486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853331.40541: done with get_vars()
15494 1726853331.40550: done getting variables
15494 1726853331.40717: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Friday 20 September 2024 13:28:51 -0400 (0:00:00.023) 0:00:00.023 ******
15494 1726853331.40739: entering _queue_task() for managed_node1/gather_facts
15494 1726853331.40740: Creating lock for gather_facts
15494 1726853331.41707: worker is 1 (out of 1 available)
15494 1726853331.41715: exiting _queue_task() for managed_node1/gather_facts
15494 1726853331.41726: done queuing things up, now waiting for results queue to drain
15494 1726853331.41728: waiting for pending results...
15494 1726853331.41907: running TaskExecutor() for managed_node1/TASK: Gathering Facts
15494 1726853331.41978: in run() - task 02083763-bbaf-0028-1a50-00000000007e
15494 1726853331.42180: variable 'ansible_search_path' from source: unknown
15494 1726853331.42184: calling self._execute()
15494 1726853331.42408: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853331.42413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853331.42424: variable 'omit' from source: magic vars
15494 1726853331.42637: variable 'omit' from source: magic vars
15494 1726853331.42665: variable 'omit' from source: magic vars
15494 1726853331.42759: variable 'omit' from source: magic vars
15494 1726853331.42800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15494 1726853331.42834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15494 1726853331.42885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15494 1726853331.42906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15494 1726853331.42921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15494 1726853331.42957: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15494 1726853331.42965: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853331.42975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853331.43079: Set connection var ansible_connection to ssh
15494 1726853331.43090: Set connection var ansible_pipelining to False
15494 1726853331.43099: Set connection var ansible_module_compression to ZIP_DEFLATED
15494 1726853331.43106: Set connection var ansible_shell_type to sh
15494 1726853331.43114: Set connection var ansible_timeout to 10
15494 1726853331.43124: Set connection var ansible_shell_executable to /bin/sh
15494 1726853331.43192: variable 'ansible_shell_executable' from source: unknown
15494 1726853331.43201: variable 'ansible_connection' from source: unknown
15494 1726853331.43208: variable 'ansible_module_compression' from source: unknown
15494 1726853331.43214: variable 'ansible_shell_type' from source: unknown
15494 1726853331.43219: variable 'ansible_shell_executable' from source: unknown
15494 1726853331.43225: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853331.43232: variable 'ansible_pipelining' from source: unknown
15494 1726853331.43237: variable 'ansible_timeout' from source: unknown
15494 1726853331.43244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853331.43427: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15494 1726853331.43443: variable 'omit' from source: magic vars
15494 1726853331.43455: starting attempt loop
15494 1726853331.43462: running the handler
15494 1726853331.43494: variable 'ansible_facts' from source: unknown
15494 1726853331.43508: _low_level_execute_command(): starting
15494 1726853331.43576: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15494 1726853331.44265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
<<<
15494 1726853331.44373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2'
debug2: fd 3 setting O_NONBLOCK
<<<
15494 1726853331.44404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4
<<<
15494 1726853331.44478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
15494 1726853331.46152: stdout chunk (state=3): >>>/root
<<<
15494 1726853331.46303: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
15494 1726853331.46307: stdout chunk (state=3): >>><<<
15494 1726853331.46309: stderr chunk (state=3): >>><<<
15494 1726853331.46421: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853331.46425: _low_level_execute_command(): starting 15494 1726853331.46428: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273 `" && echo ansible-tmp-1726853331.4633284-15530-51624114495273="` echo /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273 `" ) && sleep 0' 15494 1726853331.47295: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853331.47310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853331.47331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853331.47376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853331.47440: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853331.47491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853331.47514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853331.47531: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853331.47675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853331.49509: stdout chunk (state=3): >>>ansible-tmp-1726853331.4633284-15530-51624114495273=/root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273 <<< 15494 1726853331.49636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853331.49738: stderr chunk (state=3): >>><<< 15494 1726853331.49936: stdout chunk (state=3): >>><<< 15494 1726853331.49940: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853331.4633284-15530-51624114495273=/root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853331.49943: variable 'ansible_module_compression' from source: unknown 15494 1726853331.50280: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15494 1726853331.50283: ANSIBALLZ: Acquiring lock 15494 1726853331.50285: ANSIBALLZ: Lock acquired: 140002372342736 15494 1726853331.50287: ANSIBALLZ: Creating module 15494 1726853331.94096: ANSIBALLZ: Writing module into payload 15494 1726853331.94226: ANSIBALLZ: Writing module 15494 1726853331.94250: ANSIBALLZ: Renaming module 15494 1726853331.94254: ANSIBALLZ: Done creating module 15494 1726853331.94292: variable 'ansible_facts' from source: unknown 15494 1726853331.94297: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853331.94306: _low_level_execute_command(): starting 15494 1726853331.94312: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15494 1726853331.94879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853331.94888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853331.94898: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853331.94911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853331.94922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853331.94929: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853331.94938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853331.94953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853331.94958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853331.94965: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853331.94974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853331.94982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853331.94993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853331.95000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853331.95006: stderr chunk (state=3): >>>debug2: match found <<< 15494 1726853331.95015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853331.95076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853331.95089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853331.95278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853331.95452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853331.97148: stdout 
chunk (state=3): >>>PLATFORM <<< 15494 1726853331.97217: stdout chunk (state=3): >>>Linux <<< 15494 1726853331.97236: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 15494 1726853331.97242: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 15494 1726853331.97417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853331.97421: stdout chunk (state=3): >>><<< 15494 1726853331.97428: stderr chunk (state=3): >>><<< 15494 1726853331.97520: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853331.97531 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15494 1726853331.97577: _low_level_execute_command(): 
starting 15494 1726853331.97678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15494 1726853331.97718: Sending initial data 15494 1726853331.97722: Sent initial data (1181 bytes) 15494 1726853331.98905: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853331.98956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853331.98976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853331.99089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853332.02507: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15494 1726853332.03077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853332.03081: stdout chunk (state=3): >>><<< 15494 1726853332.03084: stderr chunk (state=3): >>><<< 15494 1726853332.03087: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853332.03089: variable 'ansible_facts' from source: unknown 15494 1726853332.03091: variable 'ansible_facts' from source: unknown 15494 1726853332.03093: variable 'ansible_module_compression' from source: unknown 15494 1726853332.03233: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853332.03276: variable 'ansible_facts' from source: unknown 15494 1726853332.03677: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py 15494 1726853332.04005: Sending initial data 15494 1726853332.04015: Sent initial data (153 bytes) 15494 1726853332.05189: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853332.05202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853332.05213: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853332.05282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853332.05293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853332.05489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853332.05563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853332.07205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853332.07279: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853332.07300: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py" <<< 15494 1726853332.07363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp84i2aniz /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py <<< 15494 1726853332.07387: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp84i2aniz" to remote "/root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py" <<< 15494 1726853332.09897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853332.10005: stderr chunk (state=3): >>><<< 15494 1726853332.10008: stdout chunk (state=3): >>><<< 15494 1726853332.10020: done transferring module to remote 15494 1726853332.10113: _low_level_execute_command(): starting 15494 1726853332.10116: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/ /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py && sleep 0' 15494 1726853332.11305: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853332.11367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853332.11605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853332.11610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853332.11678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853332.11786: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853332.13572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853332.13625: stderr chunk (state=3): >>><<< 15494 1726853332.13628: stdout chunk (state=3): >>><<< 15494 1726853332.13719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853332.13722: _low_level_execute_command(): starting 15494 1726853332.13725: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/AnsiballZ_setup.py && sleep 0' 15494 1726853332.14325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853332.14375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853332.14388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853332.14402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853332.14487: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853332.14510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15494 1726853332.14595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853332.16755: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15494 1726853332.16778: stdout chunk (state=3): >>>import _imp # builtin <<< 15494 1726853332.16802: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15494 1726853332.16853: stdout chunk (state=3): >>>import '_io' # <<< 15494 1726853332.16875: stdout chunk (state=3): >>>import 'marshal' # <<< 15494 1726853332.16894: stdout chunk (state=3): >>>import 'posix' # <<< 15494 1726853332.17142: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd928184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd927e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 15494 1726853332.17147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9281aa50> <<< 15494 1726853332.17163: stdout chunk (state=3): >>>import '_signal' # <<< 15494 1726853332.17189: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 15494 
1726853332.17220: stdout chunk (state=3): >>>import 'io' # <<< 15494 1726853332.17236: stdout chunk (state=3): >>>import '_stat' # <<< 15494 1726853332.17269: stdout chunk (state=3): >>>import 'stat' # <<< 15494 1726853332.17324: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15494 1726853332.17353: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15494 1726853332.17455: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15494 1726853332.17489: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92609130> <<< 15494 1726853332.17563: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.17574: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92609fa0> <<< 15494 1726853332.17614: stdout chunk (state=3): >>>import 'site' # <<< 15494 1726853332.17617: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15494 1726853332.18068: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15494 1726853332.18080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15494 1726853332.18104: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15494 1726853332.18139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15494 1726853332.18143: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92647dd0> <<< 15494 1726853332.18178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15494 1726853332.18181: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15494 1726853332.18209: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92647fe0> <<< 15494 1726853332.18223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15494 1726853332.18244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15494 1726853332.18268: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py <<< 15494 1726853332.18328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.18356: stdout chunk (state=3): >>>import 'itertools' # <<< 15494 1726853332.18366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9267f800> <<< 15494 1726853332.18392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9267fe90><<< 15494 1726853332.18421: stdout chunk (state=3): >>> import '_collections' # <<< 15494 1726853332.18460: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9265faa0> <<< 15494 1726853332.18470: stdout chunk (state=3): >>>import '_functools' # <<< 15494 1726853332.18490: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9265d1c0> <<< 15494 1726853332.18582: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92644f80> <<< 15494 1726853332.18604: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15494 1726853332.18633: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15494 1726853332.18658: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 
15494 1726853332.18692: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15494 1726853332.18704: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15494 1726853332.18781: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9269f6e0> <<< 15494 1726853332.18784: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9269e300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9265e060> <<< 15494 1726853332.18806: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92646e70> <<< 15494 1726853332.18838: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15494 1726853332.18853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d47a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92644200> <<< 15494 1726853332.18898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.18912: stdout chunk (state=3): >>># 
extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926d4c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d4b00> <<< 15494 1726853332.18945: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.18955: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926d4ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92642d20> <<< 15494 1726853332.18998: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.19011: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15494 1726853332.19044: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15494 1726853332.19055: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d55b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d5280> import 'importlib.machinery' # <<< 15494 1726853332.19104: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15494 1726853332.19123: stdout chunk (state=3): >>>import 'importlib._abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d64b0> import 'importlib.util' # import 'runpy' # <<< 15494 1726853332.19144: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15494 1726853332.19208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15494 1726853332.19213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 15494 1726853332.19228: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926ec680> import 'errno' # <<< 15494 1726853332.19249: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.19283: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926edd30> <<< 15494 1726853332.19309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15494 1726853332.19337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15494 1726853332.19360: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926eebd0> <<< 15494 1726853332.19374: stdout chunk (state=3): >>># extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926ef230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926ee120> <<< 15494 1726853332.19397: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15494 1726853332.19411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15494 1726853332.19454: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.19465: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926efcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926ef3e0> <<< 15494 1726853332.19495: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d6450> <<< 15494 1726853332.19518: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15494 1726853332.19562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15494 1726853332.19566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15494 1726853332.19581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15494 1726853332.19610: stdout chunk (state=3): >>># extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd923ebbc0> <<< 15494 1726853332.19641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15494 1726853332.19673: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd92414710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92414470> <<< 15494 1726853332.19714: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd924146b0> <<< 15494 1726853332.19725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15494 1726853332.19799: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.19911: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd92414fe0> <<< 15494 1726853332.20024: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.20029: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd92415910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92414890> <<< 15494 1726853332.20050: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd923e9d60> <<< 15494 1726853332.20068: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15494 1726853332.20090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15494 1726853332.20125: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15494 1726853332.20131: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92416cc0> <<< 15494 1726853332.20150: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92415790> <<< 15494 1726853332.20173: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d6ba0> <<< 15494 1726853332.20193: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15494 1726853332.20256: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.20275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15494 1726853332.20307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15494 1726853332.20332: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92443020> <<< 15494 1726853332.20389: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 15494 1726853332.20394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.20417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15494 1726853332.20439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15494 1726853332.20477: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924633e0> <<< 15494 1726853332.20496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15494 1726853332.20540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15494 1726853332.20586: stdout chunk (state=3): >>>import 'ntpath' # <<< 15494 1726853332.20611: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcd924c4200> <<< 15494 1726853332.20637: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15494 1726853332.20658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15494 1726853332.20687: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15494 1726853332.20719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15494 1726853332.20806: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924c6960> <<< 15494 1726853332.20876: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924c4320> <<< 15494 1726853332.20912: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924911f0> <<< 15494 1726853332.20934: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d252e0> <<< 15494 1726853332.20954: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924621e0> <<< 15494 1726853332.20958: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92417bf0> <<< 15494 1726853332.21123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15494 1726853332.21140: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd92462300> 
<<< 15494 1726853332.21413: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xq4si43g/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 15494 1726853332.21773: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15494 1726853332.21777: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d8af90> import '_typing' # <<< 15494 1726853332.21905: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d69e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d69040> # zipimport: zlib available <<< 15494 1726853332.21941: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 15494 1726853332.21968: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.22000: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 15494 1726853332.23396: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.24612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d88e60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15494 1726853332.24630: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15494 1726853332.24640: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91dbe930> <<< 15494 1726853332.24678: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbe720> <<< 15494 1726853332.24727: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbe030> <<< 15494 1726853332.24804: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15494 1726853332.24837: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbea50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d8bc20> import 'atexit' # # extension module 'grp' loaded from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91dbf680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.24859: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91dbf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15494 1726853332.24935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 15494 1726853332.24981: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbfe00> import 'pwd' # <<< 15494 1726853332.25038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15494 1726853332.25066: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c25b80> <<< 15494 1726853332.25089: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c277a0> <<< 15494 1726853332.25114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from 
'/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15494 1726853332.25156: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c281a0> <<< 15494 1726853332.25251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15494 1726853332.25464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c29340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15494 1726853332.25468: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c2bdd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c28110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c2a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15494 1726853332.25499: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15494 1726853332.25591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15494 1726853332.25613: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c33c80> <<< 15494 1726853332.25691: stdout chunk (state=3): >>>import '_tokenize' # <<< 15494 1726853332.25723: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c32750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c324b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15494 1726853332.26003: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c32a20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c2a5a0> <<< 15494 1726853332.26034: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c77f20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fcd91c780b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c79b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c79940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15494 1726853332.26086: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15494 1726853332.26117: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c7c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c7a270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15494 1726853332.26153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.26178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15494 1726853332.26288: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c7f860> <<< 15494 1726853332.26495: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c7c230> <<< 15494 1726853332.26535: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c80890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c80a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c80920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c78290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15494 
1726853332.26549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15494 1726853332.26588: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.26592: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b0c0e0> <<< 15494 1726853332.26736: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.26759: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b0d430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c82870> <<< 15494 1726853332.26793: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c83c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c824e0> # zipimport: zlib available <<< 15494 1726853332.26845: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15494 1726853332.26860: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.26919: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.26998: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 15494 1726853332.27062: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 15494 1726853332.27088: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15494 1726853332.27316: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.27319: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.27817: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.28346: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15494 1726853332.28375: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15494 1726853332.28401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.28449: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.28460: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b11610> <<< 15494 1726853332.28527: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 15494 1726853332.28539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b12360> <<< 15494 1726853332.28564: 
stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b0d5e0> <<< 15494 1726853332.28610: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15494 1726853332.28635: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.28648: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 15494 1726853332.28666: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.28801: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.28964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b12ab0> <<< 15494 1726853332.28981: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.29435: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.30100: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15494 1726853332.30149: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.30231: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15494 1726853332.30254: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.30277: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15494 1726853332.30319: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.30352: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 
15494 1726853332.30591: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.30802: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15494 1726853332.30897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15494 1726853332.30931: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b136b0> <<< 15494 1726853332.30954: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.31016: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.31085: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15494 1726853332.31116: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15494 1726853332.31227: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 15494 1726853332.31252: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.31299: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.31350: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.31434: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15494 1726853332.31464: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.31650: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 
'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b1e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b19040> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15494 1726853332.31813: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.31817: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.31834: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15494 1726853332.31866: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15494 1726853332.31887: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15494 1726853332.31935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15494 1726853332.31981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15494 1726853332.31992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15494 1726853332.32108: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c06bd0> <<< 15494 1726853332.32111: stdout chunk (state=3): >>>import 
'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91cfe8d0> <<< 15494 1726853332.32141: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b1e3c0> <<< 15494 1726853332.32215: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b1e180> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.32227: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15494 1726853332.32314: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15494 1726853332.32329: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.32413: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.32533: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.32580: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.32624: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.32674: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15494 1726853332.32729: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.32870: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15494 1726853332.33039: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.33208: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.33252: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 
1726853332.33314: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853332.33339: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 15494 1726853332.33364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15494 1726853332.33405: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15494 1726853332.33408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15494 1726853332.33448: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15494 1726853332.33472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15494 1726853332.33476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15494 1726853332.33526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15494 1726853332.33552: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15494 1726853332.33566: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' 
import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180c260> <<< 15494 1726853332.33601: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.33617: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd9180c5c0> <<< 15494 1726853332.33656: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b98590> <<< 15494 1726853332.33703: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb2ea0> <<< 15494 1726853332.33707: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb09e0> <<< 15494 1726853332.33735: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb0620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15494 1726853332.33805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15494 1726853332.33847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15494 1726853332.33864: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 15494 1726853332.33897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15494 1726853332.33931: stdout chunk (state=3): 
>>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd9180f5f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180eea0> <<< 15494 1726853332.33963: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd9180f080> <<< 15494 1726853332.33966: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180e300> <<< 15494 1726853332.33990: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15494 1726853332.34089: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15494 1726853332.34107: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15494 1726853332.34134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15494 1726853332.34168: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd918721e0> <<< 15494 1726853332.34220: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91870200> <<< 15494 1726853332.34250: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb0530> import 'ansible.module_utils.facts.timeout' # <<< 15494 1726853332.34284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 15494 1726853332.34303: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 15494 1726853332.34324: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34360: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15494 1726853332.34440: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34487: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34554: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 15494 1726853332.34583: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15494 1726853332.34601: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34616: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34652: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 15494 1726853332.34662: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34704: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34766: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 15494 1726853332.34847: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.34870: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 15494 1726853332.35002: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.35083: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.35120: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15494 1726853332.35706: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36057: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 15494 1726853332.36085: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36168: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.36206: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 15494 1726853332.36223: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36256: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36288: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 15494 1726853332.36300: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36355: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15494 1726853332.36422: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36455: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36494: 
stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15494 1726853332.36524: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36567: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15494 1726853332.36570: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.36736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15494 1726853332.36759: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91873710> <<< 15494 1726853332.36786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15494 1726853332.36817: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15494 1726853332.36940: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91872cf0> import 'ansible.module_utils.facts.system.local' # <<< 15494 1726853332.36957: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37010: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37084: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 15494 1726853332.37097: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37178: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15494 1726853332.37288: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37342: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15494 1726853332.37441: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37467: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37516: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15494 1726853332.37563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15494 1726853332.37631: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.37698: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd918ae2a0> <<< 15494 1726853332.37896: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91873a40> import 'ansible.module_utils.facts.system.python' # <<< 15494 1726853332.37912: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.37955: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38021: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15494 1726853332.38026: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38103: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38187: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38299: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38460: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 15494 
1726853332.38465: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38499: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15494 1726853332.38557: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38596: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38644: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15494 1726853332.38709: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853332.38716: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd918c1e50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd918c36b0> import 'ansible.module_utils.facts.system.user' # <<< 15494 1726853332.38760: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 15494 1726853332.38763: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38798: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.38858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 15494 1726853332.38861: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39003: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15494 
1726853332.39268: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39366: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39407: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 15494 1726853332.39483: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39509: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39519: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39655: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15494 1726853332.39816: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.39936: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.40075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 15494 1726853332.40084: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.40099: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.40133: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.40689: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41207: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 15494 1726853332.41211: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41314: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15494 1726853332.41438: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 15494 1726853332.41526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 15494 1726853332.41639: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41788: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15494 1726853332.41974: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.41994: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 15494 1726853332.42033: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 15494 1726853332.42090: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42183: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42284: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42488: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15494 1726853332.42714: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42742: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15494 1726853332.42807: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42844: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.42850: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 15494 1726853332.42865: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15494 1726853332.42922: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15494 1726853332.43015: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43078: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15494 1726853332.43081: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43204: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43207: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15494 1726853332.43338: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43341: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15494 1726853332.43698: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15494 1726853332.43858: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43909: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.43966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15494 1726853332.43997: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44016: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44061: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 15494 1726853332.44064: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44091: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44135: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.netbsd' # <<< 15494 1726853332.44300: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44313: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.44376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15494 1726853332.44417: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15494 1726853332.44429: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44465: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 15494 1726853332.44542: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853332.44564: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44659: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44673: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44737: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44802: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 15494 1726853332.44825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 15494 1726853332.44876: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.44925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 15494 1726853332.44942: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45131: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45333: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15494 1726853332.45344: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45397: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 15494 1726853332.45484: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 15494 1726853332.45547: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45626: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15494 1726853332.45727: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45803: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.45897: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15494 1726853332.45981: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853332.46186: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15494 1726853332.46212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15494 1726853332.46255: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd916636e0> <<< 15494 1726853332.46274: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916600b0> <<< 15494 1726853332.46329: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91662cc0> <<< 15494 1726853332.60314: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 15494 1726853332.60318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 15494 1726853332.60367: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916aa2a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 15494 1726853332.60383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 15494 1726853332.60395: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916a8c20> <<< 15494 1726853332.60449: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 15494 1726853332.60492: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 15494 1726853332.60514: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916aa270> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916a9d30> <<< 15494 1726853332.60773: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15494 1726853332.85125: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "52", "epoch": "1726853332", "epoch_int": "1726853332", "date": "2024-09-20", "time": "13:28:52", "iso8601_micro": "2024-09-20T17:28:52.461220Z", "iso8601": "2024-09-20T17:28:52Z", "iso8601_basic": "20240920T132852461220", "iso8601_basic_short": "20240920T132852", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": 
"ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, 
"config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 498, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797347328, "block_size": 4096, "block_total": 65519099, "block_available": 63915368, "block_used": 1603731, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.62841796875, "5m": 0.34765625, "15m": 0.14892578125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": 
"on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_iscsi_iqn": "", 
"ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_local": {}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853332.85821: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout <<< 15494 1726853332.85830: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing 
zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] 
removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random <<< 15494 1726853332.85986: stdout chunk (state=3): >>># cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] 
removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] 
removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] 
removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy 
ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn <<< 15494 1726853332.86110: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # 
cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # 
destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # 
destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15494 1726853332.86388: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15494 1726853332.86438: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 15494 1726853332.86576: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 15494 1726853332.86612: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15494 1726853332.86642: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 15494 1726853332.86734: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector 
# destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 15494 1726853332.86785: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 15494 1726853332.86910: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 15494 1726853332.86979: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 15494 1726853332.87059: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping 
platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 15494 1726853332.87473: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid 
# destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15494 1726853332.87477: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15494 1726853332.87575: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 15494 1726853332.87611: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 15494 1726853332.87667: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 15494 1726853332.87774: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15494 1726853332.88134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853332.88143: stdout chunk (state=3): >>><<< 15494 1726853332.88157: stderr chunk (state=3): >>><<< 15494 1726853332.88892: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd928184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd927e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9281aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92609130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92609fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92647dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92647fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9267f800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9267fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9265faa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9265d1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92644f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9269f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9269e300> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9265e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92646e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d47a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92644200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926d4c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d4b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926d4ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92642d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d55b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d5280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d64b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926ec680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926edd30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926eebd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926ef230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926ee120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd926efcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926ef3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d6450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd923ebbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd92414710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92414470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd924146b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd92414fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd92415910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92414890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd923e9d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92416cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92415790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd926d6ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92443020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924633e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924c4200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924c6960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924c4320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924911f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd924621e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd92417bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd92462300> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xq4si43g/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d8af90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d69e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d69040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d88e60> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91dbe930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbe720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbe030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbea50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91d8bc20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91dbf680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91dbf8c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91dbfe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c25b80> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c277a0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c281a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c29340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c2bdd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c28110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c2a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c33c80> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c32750> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c324b0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c32a20> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c2a5a0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c77f20> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c780b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c79b80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c79940> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c7c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c7a270> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c7f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c7c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c80890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c80a40> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c80920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c78290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b0c0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b0d430> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c82870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91c83c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c824e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b11610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b12360> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b0d5e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b12ab0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b136b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd91b1e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b19040> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91c06bd0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91cfe8d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b1e3c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b1e180> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180c260> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd9180c5c0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91b98590> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb2ea0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb09e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb0620> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd9180f5f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180eea0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd9180f080> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180e300> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd9180f6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd918721e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91870200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91bb0530> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91873710> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91872cf0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd918ae2a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91873a40> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd918c1e50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd918c36b0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd916636e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916600b0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd91662cc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916aa2a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916a8c20> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916aa270> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd916a9d30> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "52", "epoch": "1726853332", "epoch_int": "1726853332", "date": "2024-09-20", "time": "13:28:52", "iso8601_micro": "2024-09-20T17:28:52.461220Z", "iso8601": "2024-09-20T17:28:52Z", "iso8601_basic": "20240920T132852461220", "iso8601_basic_short": "20240920T132852", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", 
"ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": 
[]}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 498, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797347328, "block_size": 4096, "block_total": 65519099, "block_available": 63915368, "block_used": 1603731, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.62841796875, "5m": 0.34765625, "15m": 0.14892578125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, 
"crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": 
"on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_local": {}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] 
removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy 
__future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing 
_ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy 
json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
15494 1726853332.93666: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853332.93670: _low_level_execute_command(): starting 15494 1726853332.93675: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853331.4633284-15530-51624114495273/ > /dev/null 2>&1 && sleep 0' 15494 1726853332.94756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853332.94774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853332.94778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853332.95181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853332.95212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853332.97376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853332.97380: stdout chunk (state=3): >>><<< 15494 1726853332.97383: stderr chunk (state=3): >>><<< 15494 1726853332.97385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853332.97387: handler run complete 15494 1726853332.97516: variable 'ansible_facts' from source: unknown 15494 
1726853332.97785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853332.98341: variable 'ansible_facts' from source: unknown 15494 1726853332.98561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853332.98729: attempt loop complete, returning result 15494 1726853332.98732: _execute() done 15494 1726853332.98851: dumping result to json 15494 1726853332.98928: done dumping result, returning 15494 1726853332.98931: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-00000000007e] 15494 1726853332.98934: sending task result for task 02083763-bbaf-0028-1a50-00000000007e 15494 1726853333.00298: done sending task result for task 02083763-bbaf-0028-1a50-00000000007e 15494 1726853333.00302: WORKER PROCESS EXITING ok: [managed_node1] 15494 1726853333.00803: no more pending results, returning what we have 15494 1726853333.00806: results queue empty 15494 1726853333.00807: checking for any_errors_fatal 15494 1726853333.00808: done checking for any_errors_fatal 15494 1726853333.00809: checking for max_fail_percentage 15494 1726853333.00810: done checking for max_fail_percentage 15494 1726853333.00811: checking to see if all hosts have failed and the running result is not ok 15494 1726853333.00812: done checking to see if all hosts have failed 15494 1726853333.00813: getting the remaining hosts for this loop 15494 1726853333.00814: done getting the remaining hosts for this loop 15494 1726853333.00818: getting the next task for host managed_node1 15494 1726853333.00824: done getting next task for host managed_node1 15494 1726853333.00826: ^ task is: TASK: meta (flush_handlers) 15494 1726853333.00828: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853333.00832: getting variables 15494 1726853333.00833: in VariableManager get_vars() 15494 1726853333.00853: Calling all_inventory to load vars for managed_node1 15494 1726853333.00856: Calling groups_inventory to load vars for managed_node1 15494 1726853333.00859: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853333.00868: Calling all_plugins_play to load vars for managed_node1 15494 1726853333.00984: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853333.00990: Calling groups_plugins_play to load vars for managed_node1 15494 1726853333.01364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.01877: done with get_vars() 15494 1726853333.01887: done getting variables 15494 1726853333.01942: in VariableManager get_vars() 15494 1726853333.01951: Calling all_inventory to load vars for managed_node1 15494 1726853333.01953: Calling groups_inventory to load vars for managed_node1 15494 1726853333.01955: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853333.02075: Calling all_plugins_play to load vars for managed_node1 15494 1726853333.02078: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853333.02081: Calling groups_plugins_play to load vars for managed_node1 15494 1726853333.02327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.02757: done with get_vars() 15494 1726853333.02769: done queuing things up, now waiting for results queue to drain 15494 1726853333.02775: results queue empty 15494 1726853333.02776: checking for any_errors_fatal 15494 1726853333.02778: done checking for any_errors_fatal 15494 1726853333.02779: checking for max_fail_percentage 15494 1726853333.02780: done checking for 
max_fail_percentage 15494 1726853333.02780: checking to see if all hosts have failed and the running result is not ok 15494 1726853333.02781: done checking to see if all hosts have failed 15494 1726853333.02787: getting the remaining hosts for this loop 15494 1726853333.02788: done getting the remaining hosts for this loop 15494 1726853333.02790: getting the next task for host managed_node1 15494 1726853333.02795: done getting next task for host managed_node1 15494 1726853333.02797: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15494 1726853333.02799: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853333.02801: getting variables 15494 1726853333.02801: in VariableManager get_vars() 15494 1726853333.02809: Calling all_inventory to load vars for managed_node1 15494 1726853333.02811: Calling groups_inventory to load vars for managed_node1 15494 1726853333.02813: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853333.02818: Calling all_plugins_play to load vars for managed_node1 15494 1726853333.02820: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853333.02823: Calling groups_plugins_play to load vars for managed_node1 15494 1726853333.03069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.03580: done with get_vars() 15494 1726853333.03587: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Friday 20 September 2024 13:28:53 -0400 (0:00:01.630) 0:00:01.653 ****** 15494 1726853333.03779: entering _queue_task() for 
managed_node1/include_tasks 15494 1726853333.03781: Creating lock for include_tasks 15494 1726853333.04343: worker is 1 (out of 1 available) 15494 1726853333.04354: exiting _queue_task() for managed_node1/include_tasks 15494 1726853333.04367: done queuing things up, now waiting for results queue to drain 15494 1726853333.04368: waiting for pending results... 15494 1726853333.05187: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 15494 1726853333.05192: in run() - task 02083763-bbaf-0028-1a50-000000000006 15494 1726853333.05195: variable 'ansible_search_path' from source: unknown 15494 1726853333.05198: calling self._execute() 15494 1726853333.05776: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853333.05781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.05783: variable 'omit' from source: magic vars 15494 1726853333.05786: _execute() done 15494 1726853333.05788: dumping result to json 15494 1726853333.05791: done dumping result, returning 15494 1726853333.05793: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-0028-1a50-000000000006] 15494 1726853333.05796: sending task result for task 02083763-bbaf-0028-1a50-000000000006 15494 1726853333.05918: no more pending results, returning what we have 15494 1726853333.05926: in VariableManager get_vars() 15494 1726853333.05959: Calling all_inventory to load vars for managed_node1 15494 1726853333.05962: Calling groups_inventory to load vars for managed_node1 15494 1726853333.05966: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853333.05981: Calling all_plugins_play to load vars for managed_node1 15494 1726853333.05985: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853333.05988: Calling groups_plugins_play to load vars for managed_node1 15494 1726853333.06279: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.06688: done with get_vars() 15494 1726853333.06696: variable 'ansible_search_path' from source: unknown 15494 1726853333.06710: we have included files to process 15494 1726853333.06711: generating all_blocks data 15494 1726853333.06713: done generating all_blocks data 15494 1726853333.06713: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15494 1726853333.06715: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15494 1726853333.06717: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15494 1726853333.07258: done sending task result for task 02083763-bbaf-0028-1a50-000000000006 15494 1726853333.07262: WORKER PROCESS EXITING 15494 1726853333.08189: in VariableManager get_vars() 15494 1726853333.08204: done with get_vars() 15494 1726853333.08215: done processing included file 15494 1726853333.08217: iterating over new_blocks loaded from include file 15494 1726853333.08218: in VariableManager get_vars() 15494 1726853333.08229: done with get_vars() 15494 1726853333.08230: filtering new block on tags 15494 1726853333.08357: done filtering new block on tags 15494 1726853333.08361: in VariableManager get_vars() 15494 1726853333.08373: done with get_vars() 15494 1726853333.08374: filtering new block on tags 15494 1726853333.08390: done filtering new block on tags 15494 1726853333.08392: in VariableManager get_vars() 15494 1726853333.08400: done with get_vars() 15494 1726853333.08402: filtering new block on tags 15494 1726853333.08413: done filtering new block on tags 15494 1726853333.08415: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 15494 1726853333.08420: extending task lists for all hosts with included blocks 15494 1726853333.08464: done extending task lists 15494 1726853333.08465: done processing included files 15494 1726853333.08466: results queue empty 15494 1726853333.08466: checking for any_errors_fatal 15494 1726853333.08468: done checking for any_errors_fatal 15494 1726853333.08469: checking for max_fail_percentage 15494 1726853333.08470: done checking for max_fail_percentage 15494 1726853333.08472: checking to see if all hosts have failed and the running result is not ok 15494 1726853333.08585: done checking to see if all hosts have failed 15494 1726853333.08586: getting the remaining hosts for this loop 15494 1726853333.08587: done getting the remaining hosts for this loop 15494 1726853333.08590: getting the next task for host managed_node1 15494 1726853333.08595: done getting next task for host managed_node1 15494 1726853333.08597: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15494 1726853333.08600: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853333.08602: getting variables 15494 1726853333.08603: in VariableManager get_vars() 15494 1726853333.08611: Calling all_inventory to load vars for managed_node1 15494 1726853333.08613: Calling groups_inventory to load vars for managed_node1 15494 1726853333.08616: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853333.08621: Calling all_plugins_play to load vars for managed_node1 15494 1726853333.08624: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853333.08627: Calling groups_plugins_play to load vars for managed_node1 15494 1726853333.08870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.09369: done with get_vars() 15494 1726853333.09380: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:28:53 -0400 (0:00:00.056) 0:00:01.710 ****** 15494 1726853333.09442: entering _queue_task() for managed_node1/setup 15494 1726853333.10058: worker is 1 (out of 1 available) 15494 1726853333.10070: exiting _queue_task() for managed_node1/setup 15494 1726853333.10437: done queuing things up, now waiting for results queue to drain 15494 1726853333.10439: waiting for pending results... 
15494 1726853333.10573: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 15494 1726853333.10795: in run() - task 02083763-bbaf-0028-1a50-00000000008f 15494 1726853333.10815: variable 'ansible_search_path' from source: unknown 15494 1726853333.10823: variable 'ansible_search_path' from source: unknown 15494 1726853333.10983: calling self._execute() 15494 1726853333.11066: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853333.11110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.11189: variable 'omit' from source: magic vars 15494 1726853333.12307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853333.16673: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853333.16826: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853333.16999: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853333.17049: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853333.17083: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853333.17285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853333.17323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853333.17352: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853333.17425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853333.17509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853333.17886: variable 'ansible_facts' from source: unknown 15494 1726853333.18012: variable 'network_test_required_facts' from source: task vars 15494 1726853333.18121: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15494 1726853333.18132: variable 'omit' from source: magic vars 15494 1726853333.18229: variable 'omit' from source: magic vars 15494 1726853333.18321: variable 'omit' from source: magic vars 15494 1726853333.18349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853333.18435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853333.18457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853333.18534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853333.18550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853333.18675: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853333.18680: variable 'ansible_host' from source: host vars for 
'managed_node1' 15494 1726853333.18683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.18844: Set connection var ansible_connection to ssh 15494 1726853333.19061: Set connection var ansible_pipelining to False 15494 1726853333.19064: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853333.19067: Set connection var ansible_shell_type to sh 15494 1726853333.19069: Set connection var ansible_timeout to 10 15494 1726853333.19073: Set connection var ansible_shell_executable to /bin/sh 15494 1726853333.19075: variable 'ansible_shell_executable' from source: unknown 15494 1726853333.19077: variable 'ansible_connection' from source: unknown 15494 1726853333.19079: variable 'ansible_module_compression' from source: unknown 15494 1726853333.19081: variable 'ansible_shell_type' from source: unknown 15494 1726853333.19084: variable 'ansible_shell_executable' from source: unknown 15494 1726853333.19086: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853333.19090: variable 'ansible_pipelining' from source: unknown 15494 1726853333.19092: variable 'ansible_timeout' from source: unknown 15494 1726853333.19094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.19275: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853333.19476: variable 'omit' from source: magic vars 15494 1726853333.19479: starting attempt loop 15494 1726853333.19482: running the handler 15494 1726853333.19484: _low_level_execute_command(): starting 15494 1726853333.19487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853333.21150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853333.21154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.21269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853333.22941: stdout chunk (state=3): >>>/root <<< 15494 1726853333.22983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.23151: stderr chunk (state=3): >>><<< 15494 1726853333.23155: stdout chunk (state=3): >>><<< 15494 1726853333.23178: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853333.23302: _low_level_execute_command(): starting 15494 1726853333.23306: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059 `" && echo ansible-tmp-1726853333.2317913-15617-243866696185059="` echo /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059 `" ) && sleep 0' 15494 1726853333.24276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853333.24478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853333.24481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853333.24484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853333.24486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853333.24488: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853333.24491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.24493: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 15494 1726853333.24495: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853333.24496: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853333.24498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853333.24500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853333.24502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853333.24509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.24648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853333.24766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.24820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853333.26781: stdout chunk (state=3): >>>ansible-tmp-1726853333.2317913-15617-243866696185059=/root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059 <<< 15494 1726853333.26876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.26880: stdout chunk (state=3): >>><<< 15494 1726853333.26882: stderr chunk (state=3): >>><<< 15494 1726853333.26884: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853333.2317913-15617-243866696185059=/root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853333.26912: variable 'ansible_module_compression' from source: unknown 15494 1726853333.27009: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853333.27117: variable 'ansible_facts' from source: unknown 15494 1726853333.27554: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py 15494 1726853333.27902: Sending initial data 15494 1726853333.27905: Sent initial data (154 bytes) 15494 1726853333.29110: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.29142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853333.29380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.29414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853333.30936: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853333.30943: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15494 1726853333.30950: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15494 1726853333.31010: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853333.31054: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 15494 1726853333.31210: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpfm8b1l2y /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py <<< 15494 1726853333.31214: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py" <<< 15494 1726853333.31218: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpfm8b1l2y" to remote "/root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py" <<< 15494 1726853333.33931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.33938: stdout chunk (state=3): >>><<< 15494 1726853333.33941: stderr chunk (state=3): >>><<< 15494 1726853333.33943: done transferring module to remote 15494 1726853333.34127: _low_level_execute_command(): starting 15494 1726853333.34131: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/ /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py && sleep 0' 15494 1726853333.35352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853333.35366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853333.35467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.35537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853333.37454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.37465: stdout chunk (state=3): >>><<< 15494 1726853333.37479: stderr chunk (state=3): >>><<< 15494 1726853333.37498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853333.37511: _low_level_execute_command(): starting 15494 1726853333.37520: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/AnsiballZ_setup.py && sleep 0' 15494 1726853333.38824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.38837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853333.38883: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.39007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853333.39018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853333.39144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.39307: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15494 1726853333.41445: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15494 1726853333.41494: stdout chunk (state=3): >>>import _imp # builtin <<< 15494 1726853333.41517: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15494 1726853333.41617: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15494 1726853333.41632: stdout chunk (state=3): >>>import 'posix' # <<< 15494 1726853333.41704: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 15494 1726853333.41752: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.41773: stdout chunk (state=3): >>>import '_codecs' # <<< 15494 1726853333.41808: stdout chunk (state=3): >>>import 'codecs' # <<< 15494 1726853333.41833: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15494 1726853333.41861: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15494 1726853333.41877: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0b0184d0> <<< 15494 1726853333.41880: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0afe7b30> <<< 15494 1726853333.41910: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 15494 1726853333.41913: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 15494 1726853333.41925: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0b01aa50> <<< 15494 1726853333.42194: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 15494 1726853333.42239: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 15494 1726853333.42245: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 15494 1726853333.42248: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15494 1726853333.42287: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0adc9130> <<< 15494 1726853333.42386: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15494 1726853333.42389: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.42401: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0adc9fa0> import 'site' # <<< 15494 1726853333.42414: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", 
"credits" or "license" for more information. <<< 15494 1726853333.42792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15494 1726853333.42911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15494 1726853333.43010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae07e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15494 1726853333.43024: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae07f50> <<< 15494 1726853333.43043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15494 1726853333.43068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15494 1726853333.43093: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15494 1726853333.43156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 
'itertools' # <<< 15494 1726853333.43190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae3f890> <<< 15494 1726853333.43266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15494 1726853333.43269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae3ff20> import '_collections' # <<< 15494 1726853333.43456: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae1fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae1d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae05040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15494 1726853333.43488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15494 1726853333.43491: stdout chunk (state=3): >>>import '_sre' # <<< 15494 1726853333.43557: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15494 1726853333.43561: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15494 1726853333.43645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15494 1726853333.43648: 
stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae5f800> <<< 15494 1726853333.43660: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae5e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae1e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae5cc80> <<< 15494 1726853333.43702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15494 1726853333.43718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae94890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae042c0> <<< 15494 1726853333.43766: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15494 1726853333.43794: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.43867: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0ae94d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae94bf0> <<< 15494 1726853333.43997: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0ae94fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae02de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae953a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae965d0> <<< 15494 1726853333.44022: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 15494 1726853333.44052: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15494 1726853333.44087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15494 1726853333.44112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeac7a0> 
<<< 15494 1726853333.44128: stdout chunk (state=3): >>>import 'errno' # <<< 15494 1726853333.44161: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.44198: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aeadeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15494 1726853333.44248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15494 1726853333.44251: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeaed50> <<< 15494 1726853333.44302: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aeaf380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeae2a0> <<< 15494 1726853333.44355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15494 1726853333.44402: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.44405: stdout chunk (state=3): >>># extension module '_lzma' 
executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aeafe00> <<< 15494 1726853333.44407: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeaf530> <<< 15494 1726853333.44460: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae96570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15494 1726853333.44482: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15494 1726853333.44520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 15494 1726853333.44619: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aba7ce0> <<< 15494 1726853333.44634: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15494 1726853333.45000: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2b0abd04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd0770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd10a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd1a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd0950> <<< 15494 1726853333.45011: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aba5e80> <<< 15494 1726853333.45038: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15494 1726853333.45062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15494 1726853333.45112: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd2e10> <<< 15494 1726853333.45123: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd18e0> <<< 15494 1726853333.45147: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae96cc0> <<< 15494 1726853333.45232: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15494 1726853333.45235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.45251: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15494 1726853333.45318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15494 1726853333.45328: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abfb170> <<< 15494 1726853333.45388: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.45435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15494 1726853333.45468: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac1f4d0> <<< 15494 1726853333.45547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15494 1726853333.45550: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15494 1726853333.45591: stdout chunk (state=3): >>>import 'ntpath' # <<< 15494 1726853333.45611: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac802f0> <<< 15494 1726853333.45661: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15494 1726853333.45665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15494 1726853333.45728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15494 1726853333.45731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15494 1726853333.45859: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac82a20> <<< 15494 1726853333.45938: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac803e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac452e0> <<< 15494 1726853333.45951: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5293d0> <<< 15494 1726853333.45975: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f2b0ac1e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd3d40> <<< 15494 1726853333.46192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b0ac1e660> <<< 15494 1726853333.46432: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_p8n5zfj0/ansible_setup_payload.zip' <<< 15494 1726853333.46435: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.46550: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.46784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 15494 1726853333.46856: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5930e0> import '_typing' # <<< 15494 1726853333.46929: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a571fd0> <<< 15494 1726853333.46941: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a571160> # zipimport: zlib available <<< 15494 1726853333.47015: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.47029: stdout chunk (state=3): >>>import 
'ansible.module_utils' # <<< 15494 1726853333.47041: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.48444: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.49594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a591760> <<< 15494 1726853333.49627: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.49652: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 15494 1726853333.49688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15494 1726853333.49721: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a5c2a80> <<< 15494 1726853333.49766: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c2810> <<< 15494 1726853333.49802: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c2120> <<< 15494 1726853333.49827: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15494 1726853333.49925: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c2b40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0b01a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a5c37a0> <<< 15494 1726853333.49951: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a5c3980> <<< 15494 1726853333.49966: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15494 1726853333.50020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 15494 1726853333.50079: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c3ec0> <<< 15494 1726853333.50128: stdout chunk (state=3): >>>import 'pwd' # <<< 15494 1726853333.50139: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15494 1726853333.50215: stdout chunk (state=3): >>>import 'platform' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a42dc40> <<< 15494 1726853333.50235: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a42f830> <<< 15494 1726853333.50277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 15494 1726853333.50298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15494 1726853333.50301: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a430230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15494 1726853333.50352: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15494 1726853333.50358: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a4313d0> <<< 15494 1726853333.50483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15494 1726853333.50486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a433ec0> <<< 15494 1726853333.50525: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0ae02ed0> <<< 15494 1726853333.50545: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a432180> <<< 15494 1726853333.50570: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15494 1726853333.50695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15494 1726853333.50699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15494 1726853333.50773: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15494 1726853333.50817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15494 1726853333.50820: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43bda0> import '_tokenize' # <<< 15494 1726853333.50892: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43a870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43a5d0> <<< 15494 1726853333.50977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc 
matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15494 1726853333.50991: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43ab40> <<< 15494 1726853333.51420: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a432690> <<< 15494 1726853333.51453: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a47fa40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a4801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a481c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a481a00> # 
/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a484110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a4822d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a487860> <<< 15494 1726853333.51575: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a484230> <<< 15494 1726853333.51638: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a488650> <<< 15494 1726853333.51668: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a488680> <<< 15494 1726853333.51710: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a488b90> <<< 15494 1726853333.51735: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a480380> <<< 15494 1726853333.51761: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15494 1726853333.51788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15494 1726853333.51814: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.51836: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a3141d0> <<< 15494 1726853333.51992: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a315460> <<< 15494 
1726853333.52021: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a48a960> <<< 15494 1726853333.52057: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a48bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a48a600> # zipimport: zlib available <<< 15494 1726853333.52170: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15494 1726853333.52191: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.52259: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.52302: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 15494 1726853333.52326: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15494 1726853333.52439: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.52575: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.53113: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.53663: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 15494 1726853333.53713: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15494 
1726853333.53753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.53765: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a319610> <<< 15494 1726853333.53868: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15494 1726853333.53996: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a31a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a315580> <<< 15494 1726853333.54000: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15494 1726853333.54117: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.54272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15494 1726853333.54300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a31a4e0> # zipimport: zlib available <<< 15494 1726853333.54753: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.55192: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.55285: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.55349: stdout 
chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 15494 1726853333.55388: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.55418: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15494 1726853333.55500: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.55620: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15494 1726853333.55722: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15494 1726853333.55931: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.56159: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15494 1726853333.56267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 15494 1726853333.56313: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a31b620> # zipimport: zlib available <<< 15494 1726853333.56503: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15494 1726853333.56506: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.56534: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.56718: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.56730: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.56792: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15494 1726853333.56833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.56925: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a326120> <<< 15494 1726853333.56960: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a323e00> <<< 15494 1726853333.56999: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15494 1726853333.57038: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57066: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57194: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57212: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.57247: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15494 1726853333.57280: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15494 1726853333.57313: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15494 1726853333.57349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15494 1726853333.57401: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15494 1726853333.57419: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a40ea50> <<< 15494 1726853333.57479: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5ee720> <<< 15494 1726853333.57544: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a326240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a488c80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 15494 1726853333.57697: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57702: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15494 1726853333.57729: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15494 1726853333.57766: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57860: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57875: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.57943: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15494 1726853333.57959: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.57987: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58041: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 15494 1726853333.58043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58125: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58175: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58198: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58238: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 15494 1726853333.58254: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58416: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58593: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58626: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.58691: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853333.58716: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 15494 1726853333.58741: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 15494 1726853333.58763: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15494 
1726853333.58798: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b6120> <<< 15494 1726853333.58825: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 15494 1726853333.58842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 15494 1726853333.58886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 15494 1726853333.58925: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 15494 1726853333.58928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 15494 1726853333.58965: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fcff50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.58969: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09fd4440> <<< 15494 1726853333.59027: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a39eff0> <<< 15494 1726853333.59048: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b6c90> <<< 15494 1726853333.59074: stdout chunk (state=3): >>>import 'multiprocessing.context' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b4800> <<< 15494 1726853333.59095: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b43b0> <<< 15494 1726853333.59108: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 15494 1726853333.59157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15494 1726853333.59191: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15494 1726853333.59204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15494 1726853333.59242: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09fd72c0> <<< 15494 1726853333.59267: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd6b70> <<< 15494 1726853333.59306: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09fd6d50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd5fa0> 
<<< 15494 1726853333.59309: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15494 1726853333.59431: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 15494 1726853333.59458: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd7410> <<< 15494 1726853333.59461: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15494 1726853333.59513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15494 1726853333.59517: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a031f10> <<< 15494 1726853333.59548: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd7ef0> <<< 15494 1726853333.59581: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b4500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 15494 1726853333.59607: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.59650: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 15494 1726853333.59699: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 15494 1726853333.59746: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 15494 1726853333.59768: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.59817: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.59976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 15494 1726853333.60024: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15494 1726853333.60281: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # <<< 15494 1726853333.60283: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.60321: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.60354: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.60391: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 15494 1726853333.60466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15494 1726853333.60886: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.61329: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 15494 1726853333.61493: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.61548: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 15494 1726853333.61575: 
stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 15494 1726853333.61634: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.61695: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15494 1726853333.61744: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.61764: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15494 1726853333.61852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.61905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15494 1726853333.61938: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.62246: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a033890> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a032870> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15494 1726853333.62272: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.62340: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15494 1726853333.62441: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.62533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15494 1726853333.62545: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15494 1726853333.62592: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.62663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 15494 1726853333.62682: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.62721: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.62760: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15494 1726853333.62813: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15494 1726853333.62885: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.62941: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a072090> <<< 15494 1726853333.63134: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a061df0> import 'ansible.module_utils.facts.system.python' # <<< 15494 1726853333.63148: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63202: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 15494 1726853333.63348: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63427: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63542: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63686: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # <<< 15494 1726853333.63702: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63740: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63777: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15494 1726853333.63792: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63825: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.63872: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15494 1726853333.63915: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853333.63945: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a085c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a063170> import 'ansible.module_utils.facts.system.user' # <<< 15494 1726853333.63986: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 15494 1726853333.63989: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64025: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64081: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 15494 1726853333.64085: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64221: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64378: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15494 1726853333.64485: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64581: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64624: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 15494 1726853333.64679: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 15494 1726853333.64766: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.64803: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.64902: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.65012: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15494 1726853333.65309: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.65313: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.65328: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.65874: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.66372: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 15494 1726853333.66411: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15494 1726853333.66491: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.66637: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 15494 1726853333.66690: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.66795: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 15494 1726853333.66961: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67111: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 15494 1726853333.67185: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15494 1726853333.67196: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.67220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 15494 1726853333.67241: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67419: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67499: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67623: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15494 1726853333.67852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67877: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15494 1726853333.67932: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.67964: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 15494 1726853333.67988: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68043: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68114: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 15494 1726853333.68143: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15494 1726853333.68195: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68233: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15494 1726853333.68354: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15494 1726853333.68428: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68678: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.68932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15494 1726853333.68943: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69002: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15494 1726853333.69067: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69100: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 15494 1726853333.69174: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 15494 1726853333.69221: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69282: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15494 1726853333.69317: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15494 1726853333.69375: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69483: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15494 1726853333.69488: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.69573: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15494 1726853333.69613: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853333.69667: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69680: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.69782: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70011: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70021: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15494 1726853333.70173: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70358: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 15494 1726853333.70381: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70443: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70468: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 15494 1726853333.70548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70576: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15494 1726853333.70647: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.70763: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 15494 1726853333.70824: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853333.71013: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 15494 1726853333.71577: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15494 1726853333.71606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15494 1726853333.71664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09e82450> <<< 15494 1726853333.71689: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09e80890> <<< 15494 1726853333.71736: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09e7bc80> <<< 15494 1726853333.72534: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": 
"False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1Vv<<< 15494 1726853333.72538: 
stdout chunk (state=3): >>>iMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "53", "epoch": "1726853333", "epoch_int": "1726853333", "date": "2024-09-20", "time": "13:28:53", "iso8601_micro": "2024-09-20T17:28:53.723463Z", "iso8601": "2024-09-20T17:28:53Z", "iso8601_basic": "20240920T132853723463", "iso8601_basic_short": "20240920T132853", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853333.73050: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15494 1726853333.73083: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 15494 1726853333.73142: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # 
cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 15494 1726853333.73162: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing 
bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 15494 1726853333.73202: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # 
cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 15494 1726853333.73252: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info <<< 15494 1726853333.73256: stdout chunk (state=3): >>># destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # 
cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue <<< 15494 1726853333.73278: stdout chunk (state=3): >>># cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local <<< 15494 1726853333.73325: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # 
cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd <<< 15494 1726853333.73348: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env 
# destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # 
destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15494 1726853333.73678: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15494 1726853333.73723: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 15494 1726853333.73756: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 15494 1726853333.73779: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 15494 1726853333.73830: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 15494 1726853333.73833: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 15494 1726853333.73880: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 15494 
1726853333.73922: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 15494 1726853333.73973: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 15494 1726853333.74002: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 15494 1726853333.74052: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 15494 1726853333.74094: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 15494 1726853333.74127: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 15494 1726853333.74130: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 15494 1726853333.74195: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping 
systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 15494 1726853333.74278: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 15494 1726853333.74310: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 15494 1726853333.74325: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # 
cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15494 1726853333.74466: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 15494 1726853333.74507: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 15494 1726853333.74559: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15494 1726853333.74592: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 15494 1726853333.74614: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15494 1726853333.74725: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 15494 1726853333.74777: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 15494 1726853333.74876: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _sre # destroy _string # 
destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15494 1726853333.75412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853333.75416: stdout chunk (state=3): >>><<< 15494 1726853333.75418: stderr chunk (state=3): >>><<< 15494 1726853333.75889: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0b0184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0afe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0b01aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding 
directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0adc9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0adc9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae07e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae07f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae3f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae3ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae1fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae1d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae05040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae5f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae5e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae1e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae5cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae94890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae042c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0ae94d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae94bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0ae94fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae02de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae953a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae965d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aeadeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeaed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aeaf380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeae2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aeafe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aeaf530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae96570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0aba7ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd0770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd10a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0abd1a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd0950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0aba5e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd2e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd18e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ae96cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abfb170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac1f4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac802f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac82a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac803e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac452e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0ac1e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0abd3d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f2b0ac1e660> # zipimport: found 103 names in '/tmp/ansible_setup_payload_p8n5zfj0/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5930e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a571fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a571160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a591760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a5c2a80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c2810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c2120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c2b40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0b01a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a5c37a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a5c3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5c3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a42dc40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a42f830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a430230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a4313d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a433ec0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0ae02ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a432180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43bda0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43a870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43a5d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a43ab40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a432690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a47fa40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a4801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a481c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a481a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a484110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a4822d0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a487860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a484230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a488650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a488680> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a488b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a480380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a3141d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a315460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a48a960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a48bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a48a600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a319610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a31a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a315580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a31a4e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a31b620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a326120> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a323e00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a40ea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a5ee720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a326240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a488c80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fcff50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09fd4440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a39eff0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f2b0a3b6c90> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b4800> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b43b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09fd72c0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd6b70> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09fd6d50> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd5fa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd7410> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a031f10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09fd7ef0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a3b4500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a033890> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a032870> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a072090> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a061df0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b0a085c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b0a063170> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2b09e82450> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09e80890> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2b09e7bc80> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "53", "epoch": "1726853333", "epoch_int": "1726853333", "date": "2024-09-20", "time": "13:28:53", "iso8601_micro": "2024-09-20T17:28:53.723463Z", "iso8601": "2024-09-20T17:28:53Z", "iso8601_basic": "20240920T132853723463", "iso8601_basic_short": "20240920T132853", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], 
"gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # 
cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian 
# cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # 
cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools 
# cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing 
collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing 
ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] 
removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy 
ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy 
importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] 
wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15494 1726853333.76959: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': 
'/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853333.76962: _low_level_execute_command(): starting 15494 1726853333.76965: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853333.2317913-15617-243866696185059/ > /dev/null 2>&1 && sleep 0' 15494 1726853333.77259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853333.77265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.77292: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853333.77297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853333.77364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.77367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853333.77377: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853333.77420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 15494 1726853333.77479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853333.79677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.79681: stdout chunk (state=3): >>><<< 15494 1726853333.79684: stderr chunk (state=3): >>><<< 15494 1726853333.79688: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853333.79690: handler run complete 15494 1726853333.79693: variable 'ansible_facts' from source: unknown 15494 1726853333.79697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.79718: variable 'ansible_facts' from source: unknown 15494 1726853333.79761: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.79883: attempt loop complete, returning result 15494 1726853333.79886: _execute() done 15494 1726853333.79889: dumping result to json 15494 1726853333.79986: done dumping result, returning 15494 1726853333.79994: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-0028-1a50-00000000008f] 15494 1726853333.79998: sending task result for task 02083763-bbaf-0028-1a50-00000000008f 15494 1726853333.80367: done sending task result for task 02083763-bbaf-0028-1a50-00000000008f 15494 1726853333.80370: WORKER PROCESS EXITING ok: [managed_node1] 15494 1726853333.80599: no more pending results, returning what we have 15494 1726853333.80602: results queue empty 15494 1726853333.80603: checking for any_errors_fatal 15494 1726853333.80606: done checking for any_errors_fatal 15494 1726853333.80606: checking for max_fail_percentage 15494 1726853333.80675: done checking for max_fail_percentage 15494 1726853333.80677: checking to see if all hosts have failed and the running result is not ok 15494 1726853333.80678: done checking to see if all hosts have failed 15494 1726853333.80680: getting the remaining hosts for this loop 15494 1726853333.80682: done getting the remaining hosts for this loop 15494 1726853333.80685: getting the next task for host managed_node1 15494 1726853333.80694: done getting next task for host managed_node1 15494 1726853333.80696: ^ task is: TASK: Check if system is ostree 15494 1726853333.80699: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853333.80702: getting variables 15494 1726853333.80704: in VariableManager get_vars() 15494 1726853333.80745: Calling all_inventory to load vars for managed_node1 15494 1726853333.80749: Calling groups_inventory to load vars for managed_node1 15494 1726853333.80752: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853333.80768: Calling all_plugins_play to load vars for managed_node1 15494 1726853333.80772: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853333.80775: Calling groups_plugins_play to load vars for managed_node1 15494 1726853333.80990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853333.81288: done with get_vars() 15494 1726853333.81299: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:28:53 -0400 (0:00:00.719) 0:00:02.430 ****** 15494 1726853333.81434: entering _queue_task() for managed_node1/stat 15494 1726853333.81755: worker is 1 (out of 1 available) 15494 1726853333.81767: exiting _queue_task() for managed_node1/stat 15494 1726853333.81997: done queuing things up, now waiting for results queue to drain 15494 1726853333.81999: waiting for pending results... 
15494 1726853333.82206: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 15494 1726853333.82211: in run() - task 02083763-bbaf-0028-1a50-000000000091 15494 1726853333.82220: variable 'ansible_search_path' from source: unknown 15494 1726853333.82225: variable 'ansible_search_path' from source: unknown 15494 1726853333.82228: calling self._execute() 15494 1726853333.82320: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853333.82341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.82358: variable 'omit' from source: magic vars 15494 1726853333.83029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853333.83357: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853333.83433: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853333.83487: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853333.83567: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853333.83736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853333.83762: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853333.83800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853333.83850: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853333.84087: Evaluated conditional (not __network_is_ostree is defined): True 15494 1726853333.84090: variable 'omit' from source: magic vars 15494 1726853333.84115: variable 'omit' from source: magic vars 15494 1726853333.84157: variable 'omit' from source: magic vars 15494 1726853333.84201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853333.84233: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853333.84256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853333.84286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853333.84376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853333.84386: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853333.84389: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853333.84391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.84465: Set connection var ansible_connection to ssh 15494 1726853333.84483: Set connection var ansible_pipelining to False 15494 1726853333.84504: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853333.84519: Set connection var ansible_shell_type to sh 15494 1726853333.84529: Set connection var ansible_timeout to 10 15494 1726853333.84539: Set connection var ansible_shell_executable to /bin/sh 15494 1726853333.84568: variable 'ansible_shell_executable' from source: unknown 15494 1726853333.84579: variable 'ansible_connection' from 
source: unknown 15494 1726853333.84585: variable 'ansible_module_compression' from source: unknown 15494 1726853333.84591: variable 'ansible_shell_type' from source: unknown 15494 1726853333.84597: variable 'ansible_shell_executable' from source: unknown 15494 1726853333.84611: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853333.84714: variable 'ansible_pipelining' from source: unknown 15494 1726853333.84717: variable 'ansible_timeout' from source: unknown 15494 1726853333.84719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853333.84800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853333.84819: variable 'omit' from source: magic vars 15494 1726853333.84876: starting attempt loop 15494 1726853333.84879: running the handler 15494 1726853333.84881: _low_level_execute_command(): starting 15494 1726853333.84884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853333.85693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853333.85779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853333.85815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.85895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853333.87630: stdout chunk (state=3): >>>/root <<< 15494 1726853333.87747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.87787: stdout chunk (state=3): >>><<< 15494 1726853333.87797: stderr chunk (state=3): >>><<< 15494 1726853333.87931: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853333.87941: _low_level_execute_command(): starting 15494 1726853333.87944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019 `" && echo ansible-tmp-1726853333.8782775-15643-1068509492019="` echo /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019 `" ) && sleep 0' 15494 1726853333.88863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853333.88867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853333.88869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853333.88895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853333.88917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853333.88996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 
1726853333.90953: stdout chunk (state=3): >>>ansible-tmp-1726853333.8782775-15643-1068509492019=/root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019 <<< 15494 1726853333.91280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853333.91284: stdout chunk (state=3): >>><<< 15494 1726853333.91286: stderr chunk (state=3): >>><<< 15494 1726853333.91289: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853333.8782775-15643-1068509492019=/root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853333.91291: variable 'ansible_module_compression' from source: unknown 15494 1726853333.91293: ANSIBALLZ: Using lock for stat 15494 1726853333.91295: ANSIBALLZ: Acquiring lock 15494 1726853333.91297: ANSIBALLZ: Lock acquired: 
140002372343360 15494 1726853333.91299: ANSIBALLZ: Creating module 15494 1726853334.06657: ANSIBALLZ: Writing module into payload 15494 1726853334.06765: ANSIBALLZ: Writing module 15494 1726853334.06775: ANSIBALLZ: Renaming module 15494 1726853334.06778: ANSIBALLZ: Done creating module 15494 1726853334.06794: variable 'ansible_facts' from source: unknown 15494 1726853334.06898: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py 15494 1726853334.07079: Sending initial data 15494 1726853334.07082: Sent initial data (151 bytes) 15494 1726853334.07934: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.07941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.08037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.09703: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853334.09710: 
stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15494 1726853334.09716: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15494 1726853334.09724: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15494 1726853334.09728: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853334.09789: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853334.09837: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpmi_ob4xf /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py <<< 15494 1726853334.09839: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py" <<< 15494 1726853334.09879: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpmi_ob4xf" to remote "/root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py" <<< 15494 1726853334.10461: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.10501: stderr chunk (state=3): >>><<< 15494 1726853334.10507: stdout chunk (state=3): >>><<< 15494 1726853334.10537: done transferring module to remote 15494 1726853334.10572: _low_level_execute_command(): starting 15494 1726853334.10576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/ /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py && sleep 0' 15494 1726853334.11837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.11840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.11858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.11922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.13747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.13751: stderr chunk (state=3): >>><<< 15494 1726853334.13776: stdout chunk (state=3): >>><<< 15494 1726853334.13783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853334.13788: _low_level_execute_command(): starting 15494 1726853334.13895: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/AnsiballZ_stat.py && sleep 0' 15494 1726853334.14843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853334.14852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853334.14898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853334.14901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.15054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853334.15058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.15060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.15129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 15494 1726853334.15258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.17458: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15494 1726853334.17486: stdout chunk (state=3): >>>import _imp # builtin <<< 15494 1726853334.17516: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15494 1726853334.17594: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15494 1726853334.17631: stdout chunk (state=3): >>>import 'posix' # <<< 15494 1726853334.17665: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15494 1726853334.17902: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea33104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea32dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3312a50> <<< 15494 1726853334.17928: stdout chunk (state=3): >>>import '_signal' # <<< 15494 1726853334.17959: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 15494 1726853334.17976: stdout chunk (state=3): >>>import 'io' # <<< 15494 
1726853334.18012: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15494 1726853334.18097: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15494 1726853334.18121: stdout chunk (state=3): >>>import 'genericpath' # <<< 15494 1726853334.18131: stdout chunk (state=3): >>>import 'posixpath' # <<< 15494 1726853334.18157: stdout chunk (state=3): >>>import 'os' # <<< 15494 1726853334.18175: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15494 1726853334.18198: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' <<< 15494 1726853334.18212: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 15494 1726853334.18233: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15494 1726853334.18394: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15494 1726853334.18501: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea30e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea30e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 15494 1726853334.18637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15494 1726853334.18656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15494 1726853334.18674: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15494 1726853334.18693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.18704: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15494 1726853334.18747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 15494 1726853334.18762: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15494 1726853334.18898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3123ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3123f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 15494 1726853334.18908: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15494 1726853334.18934: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15494 1726853334.18984: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.19000: stdout chunk (state=3): >>>import 'itertools' # <<< 15494 1726853334.19027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea315b830> <<< 15494 1726853334.19055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15494 1726853334.19069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea315bec0> <<< 15494 1726853334.19084: stdout chunk (state=3): >>>import '_collections' # <<< 15494 1726853334.19297: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea313bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3121070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15494 1726853334.19306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 15494 1726853334.19320: stdout chunk (state=3): >>>import '_sre' # <<< 15494 1726853334.19342: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 15494 1726853334.19478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15494 1726853334.19483: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 15494 1726853334.19486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15494 1726853334.19602: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea317b7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea317a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea313a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3178bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853334.19618: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b0bf0> <<< 15494 1726853334.19652: stdout chunk (state=3): >>># extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853334.19667: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea311ee10> <<< 15494 1726853334.19694: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 15494 1726853334.19712: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.19722: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 15494 1726853334.19757: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 15494 1726853334.19777: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b1370> <<< 15494 1726853334.19787: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 15494 1726853334.19809: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 15494 1726853334.19995: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from 
'/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31c8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31c9e20> <<< 15494 1726853334.20022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15494 1726853334.20027: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15494 1726853334.20134: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15494 1726853334.20148: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15494 1726853334.20151: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31cacc0> <<< 15494 1726853334.20153: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853334.20155: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31ca210> <<< 15494 1726853334.20157: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15494 1726853334.20159: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15494 1726853334.20190: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853334.20208: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31cb4a0> <<< 15494 1726853334.20458: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f5bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 15494 1726853334.20461: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f84710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f84470> <<< 15494 1726853334.20469: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f84740> <<< 15494 1726853334.20491: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15494 1726853334.20699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f85070> <<< 15494 1726853334.20819: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f85a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f84920> <<< 15494 1726853334.20844: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f59df0> <<< 15494 1726853334.20867: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc 
matches /usr/lib64/python3.12/weakref.py <<< 15494 1726853334.20904: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15494 1726853334.20912: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15494 1726853334.20925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15494 1726853334.21001: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f86e10> <<< 15494 1726853334.21005: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f85b50> <<< 15494 1726853334.21007: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b2c60> <<< 15494 1726853334.21017: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15494 1726853334.21066: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.21084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15494 1726853334.21123: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15494 1726853334.21296: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2fab170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches 
/usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2fd3500> <<< 15494 1726853334.21325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15494 1726853334.21364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15494 1726853334.21416: stdout chunk (state=3): >>>import 'ntpath' # <<< 15494 1726853334.21464: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3034260> <<< 15494 1726853334.21467: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 15494 1726853334.21482: stdout chunk (state=3): >>> <<< 15494 1726853334.21548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15494 1726853334.21556: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 15494 1726853334.21562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15494 1726853334.21654: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea30369c0> <<< 15494 1726853334.21727: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3034380> <<< 15494 1726853334.21762: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fcea2ff9250> <<< 15494 1726853334.21894: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2929340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2fd2330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f87d70> <<< 15494 1726853334.21930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15494 1726853334.21948: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcea29295e0> <<< 15494 1726853334.22119: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_48a7759t/ansible_stat_payload.zip' <<< 15494 1726853334.22134: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.22261: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.22391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15494 1726853334.22414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15494 1726853334.22445: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea297f0b0> <<< 
15494 1726853334.22457: stdout chunk (state=3): >>>import '_typing' # <<< 15494 1726853334.22709: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea295dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea295d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853334.22754: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15494 1726853334.24163: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.25319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea297cf80> <<< 15494 1726853334.25690: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea29a6a20> import 'json.scanner' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a67b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a60c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15494 1726853334.25899: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a6510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea297fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea29a77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea29a79e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a7ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2811ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea28138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15494 1726853334.25919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15494 1726853334.25940: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2815460> <<< 15494 1726853334.25954: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15494 1726853334.25994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15494 1726853334.26013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15494 1726853334.26061: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2817ec0> <<< 15494 1726853334.26098: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea297c7d0> <<< 15494 1726853334.26131: stdout chunk (state=3): >>>import 'subprocess' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcea2816180> <<< 15494 1726853334.26179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15494 1726853334.26695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281ff20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281e750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281ecc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2816690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2867fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2869d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2869b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15494 1726853334.26751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15494 1726853334.26799: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea286c290> <<< 15494 1726853334.26816: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea286a420> <<< 15494 1726853334.26898: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/logging/__init__.py <<< 15494 1726853334.26901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.26904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 15494 1726853334.26906: stdout chunk (state=3): >>>import '_string' # <<< 15494 1726853334.26945: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea286fa10> <<< 15494 1726853334.27359: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea286c3e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2870830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2870890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2870b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28684d0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea28fc290> <<< 15494 1726853334.27473: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853334.27489: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea28fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2872a20> <<< 15494 1726853334.27516: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 15494 1726853334.27538: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2873dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2872630> # zipimport: zlib available <<< 15494 1726853334.27560: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 
15494 1726853334.27594: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.27891: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 15494 1726853334.27921: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.28030: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.28569: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.29158: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15494 1726853334.29305: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2701520> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2702240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28fd430> <<< 15494 1726853334.29357: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15494 1726853334.29364: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15494 1726853334.29394: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 15494 1726853334.29408: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.29553: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.29711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15494 1726853334.29877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2702390> # zipimport: zlib available <<< 15494 1726853334.30167: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.30604: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.30786: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.30802: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853334.30825: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15494 1726853334.30852: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.30897: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.30984: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 15494 1726853334.31005: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15494 1726853334.31131: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15494 1726853334.31193: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15494 1726853334.31367: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 
1726853334.31569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15494 1726853334.31699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15494 1726853334.31702: stdout chunk (state=3): >>>import '_ast' # <<< 15494 1726853334.31705: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2703530> <<< 15494 1726853334.31707: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.31768: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.31880: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15494 1726853334.31884: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 15494 1726853334.32181: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.32184: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853334.32187: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.32192: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.32194: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15494 1726853334.32206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.32296: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea270e090> <<< 15494 1726853334.32490: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2708e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15494 1726853334.32518: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.32615: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15494 1726853334.32618: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15494 1726853334.32621: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15494 1726853334.32624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15494 1726853334.32829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15494 1726853334.32832: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15494 1726853334.32834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15494 1726853334.32837: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29fa9f0> <<< 15494 1726853334.32839: stdout chunk 
(state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29ee6c0> <<< 15494 1726853334.32927: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea270e1b0> <<< 15494 1726853334.32935: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28fd490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 15494 1726853334.32977: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15494 1726853334.33016: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15494 1726853334.33049: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15494 1726853334.33052: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.33207: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.33367: stdout chunk (state=3): >>># zipimport: zlib available <<< 15494 1726853334.33491: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 15494 1726853334.33823: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value <<< 15494 1726853334.33884: stdout chunk (state=3): >>># clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing 
_frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 15494 1726853334.33889: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path <<< 15494 1726853334.33908: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing 
warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing <<< 15494 1726853334.33966: stdout chunk (state=3): >>># cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing 
platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap <<< 15494 1726853334.33980: stdout chunk (state=3): >>># cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] 
removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # 
destroy ansible.modules <<< 15494 1726853334.34476: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch <<< 15494 1726853334.34480: stdout chunk (state=3): >>># destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 15494 1726853334.34487: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15494 1726853334.34491: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex <<< 15494 1726853334.34496: stdout chunk (state=3): >>># destroy subprocess <<< 15494 1726853334.34500: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 15494 
1726853334.34504: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 15494 1726853334.34598: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime <<< 15494 1726853334.34723: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 15494 1726853334.34808: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15494 1726853334.34825: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15494 1726853334.34855: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15494 1726853334.34925: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 15494 1726853334.34990: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 15494 1726853334.35034: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 15494 1726853334.35067: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15494 1726853334.35576: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853334.35579: stdout chunk (state=3): >>><<< 15494 1726853334.35581: stderr chunk (state=3): >>><<< 15494 1726853334.35594: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea33104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea32dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3312a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea30e5130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea30e5fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3123ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3123f80> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea315b830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea315bec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea313bb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31392b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3121070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea317b7d0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcea317a3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea313a150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3178bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b0890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31202f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31b0d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b0bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31b0fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea311ee10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b1670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b1370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b2540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31c8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31c9e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcea31cacc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31cb2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31ca210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea31cbd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31cb4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b24b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f5bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f84710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f84470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f84740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f85070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2f85a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f84920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f59df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f86e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f85b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea31b2c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2fab170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2fd3500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3034260> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea30369c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea3034380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2ff9250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2929340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2fd2330> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2f87d70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcea29295e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_48a7759t/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea297f0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea295dfa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea295d130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea297cf80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea29a6a20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a67b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a60c0> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a6510> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea297fd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea29a77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea29a79e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29a7ef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2811ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea28138c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28142c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2815460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2817ec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea297c7d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2816180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281ff20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281e9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281e750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea281ecc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2816690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2867fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2869d60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2869b20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea286c290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea286a420> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea286fa10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea286c3e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2870830> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2870890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2870b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28684d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea28fc290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea28fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2872a20> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2873dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2872630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea2701520> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2702240> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28fd430> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2702390> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2703530> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcea270e090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea2708e60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29fa9f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea29ee6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea270e1b0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcea28fd490> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
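The `auto-mux` lines in the SSH debug output above come from OpenSSH connection multiplexing, which ansible-core enables by default (this run found no config file and used defaults). A minimal sketch of the equivalent `ansible.cfg` settings — illustrative values matching the documented defaults, not read from this run:

```ini
[ssh_connection]
; -o ControlMaster=auto -o ControlPersist=60s is what produces the
; "Trying existing master at '~/.ansible/cp/...'" reuse seen above
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
control_path_dir = ~/.ansible/cp
```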
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv [...] # destroy builtins # destroy _thread # clear sys.audit hooks 15494 1726853334.36548: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/',
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853334.36553: _low_level_execute_command(): starting 15494 1726853334.36555: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853333.8782775-15643-1068509492019/ > /dev/null 2>&1 && sleep 0' 15494 1726853334.36558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853334.36560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853334.36562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853334.36564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853334.36566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853334.36568: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853334.36572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.36574: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853334.36576: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853334.36578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853334.36580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853334.36582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853334.36584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853334.36586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853334.36588: stderr chunk (state=3): 
>>>debug2: match found <<< 15494 1726853334.36590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.36592: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853334.36594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.36596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.36657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.38498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.38540: stderr chunk (state=3): >>><<< 15494 1726853334.38553: stdout chunk (state=3): >>><<< 15494 1726853334.38579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15494 1726853334.38596: handler run complete 15494 1726853334.38623: attempt loop complete, returning result 15494 1726853334.38632: _execute() done 15494 1726853334.38641: dumping result to json 15494 1726853334.38654: done dumping result, returning 15494 1726853334.38667: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [02083763-bbaf-0028-1a50-000000000091] 15494 1726853334.38678: sending task result for task 02083763-bbaf-0028-1a50-000000000091 ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15494 1726853334.38927: no more pending results, returning what we have 15494 1726853334.38930: results queue empty 15494 1726853334.38930: checking for any_errors_fatal 15494 1726853334.38938: done checking for any_errors_fatal 15494 1726853334.38938: checking for max_fail_percentage 15494 1726853334.38940: done checking for max_fail_percentage 15494 1726853334.38940: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.38941: done checking to see if all hosts have failed 15494 1726853334.38942: getting the remaining hosts for this loop 15494 1726853334.38943: done getting the remaining hosts for this loop 15494 1726853334.38949: getting the next task for host managed_node1 15494 1726853334.38955: done getting next task for host managed_node1 15494 1726853334.38957: ^ task is: TASK: Set flag to indicate system is ostree 15494 1726853334.38960: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853334.38963: getting variables 15494 1726853334.38964: in VariableManager get_vars() 15494 1726853334.38993: Calling all_inventory to load vars for managed_node1 15494 1726853334.38996: Calling groups_inventory to load vars for managed_node1 15494 1726853334.38999: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.39009: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.39012: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.39015: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.39443: done sending task result for task 02083763-bbaf-0028-1a50-000000000091 15494 1726853334.39449: WORKER PROCESS EXITING 15494 1726853334.39475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.39672: done with get_vars() 15494 1726853334.39683: done getting variables 15494 1726853334.39781: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:28:54 -0400 (0:00:00.583) 0:00:03.013 ****** 15494 1726853334.39815: entering _queue_task() for managed_node1/set_fact 15494 1726853334.39817: Creating lock for set_fact 15494 1726853334.40098: worker is 1 (out of 1 available) 15494 1726853334.40109: exiting _queue_task() for managed_node1/set_fact 15494 1726853334.40122: done queuing things up, now waiting for results queue 
to drain 15494 1726853334.40124: waiting for pending results... 15494 1726853334.40369: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 15494 1726853334.40465: in run() - task 02083763-bbaf-0028-1a50-000000000092 15494 1726853334.40492: variable 'ansible_search_path' from source: unknown 15494 1726853334.40500: variable 'ansible_search_path' from source: unknown 15494 1726853334.40542: calling self._execute() 15494 1726853334.40622: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.40634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.40647: variable 'omit' from source: magic vars 15494 1726853334.41104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853334.41412: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853334.41463: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853334.41503: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853334.41539: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853334.41777: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853334.41780: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853334.41783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 
1726853334.41785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853334.41814: Evaluated conditional (not __network_is_ostree is defined): True 15494 1726853334.41825: variable 'omit' from source: magic vars 15494 1726853334.41863: variable 'omit' from source: magic vars 15494 1726853334.41974: variable '__ostree_booted_stat' from source: set_fact 15494 1726853334.42032: variable 'omit' from source: magic vars 15494 1726853334.42060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853334.42092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853334.42124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853334.42143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853334.42158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853334.42190: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853334.42200: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.42207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.42301: Set connection var ansible_connection to ssh 15494 1726853334.42311: Set connection var ansible_pipelining to False 15494 1726853334.42320: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853334.42328: Set connection var ansible_shell_type to sh 15494 1726853334.42340: Set connection var ansible_timeout to 10 15494 1726853334.42351: Set connection var ansible_shell_executable to /bin/sh 15494 1726853334.42444: 
variable 'ansible_shell_executable' from source: unknown 15494 1726853334.42447: variable 'ansible_connection' from source: unknown 15494 1726853334.42449: variable 'ansible_module_compression' from source: unknown 15494 1726853334.42451: variable 'ansible_shell_type' from source: unknown 15494 1726853334.42453: variable 'ansible_shell_executable' from source: unknown 15494 1726853334.42455: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.42457: variable 'ansible_pipelining' from source: unknown 15494 1726853334.42459: variable 'ansible_timeout' from source: unknown 15494 1726853334.42461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.42608: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853334.42880: variable 'omit' from source: magic vars 15494 1726853334.42884: starting attempt loop 15494 1726853334.42886: running the handler 15494 1726853334.42888: handler run complete 15494 1726853334.42891: attempt loop complete, returning result 15494 1726853334.42893: _execute() done 15494 1726853334.42894: dumping result to json 15494 1726853334.42896: done dumping result, returning 15494 1726853334.42898: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [02083763-bbaf-0028-1a50-000000000092] 15494 1726853334.42900: sending task result for task 02083763-bbaf-0028-1a50-000000000092 15494 1726853334.42960: done sending task result for task 02083763-bbaf-0028-1a50-000000000092 15494 1726853334.42963: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15494 1726853334.43016: no more pending results, returning what we have 
15494 1726853334.43019: results queue empty 15494 1726853334.43020: checking for any_errors_fatal 15494 1726853334.43026: done checking for any_errors_fatal 15494 1726853334.43027: checking for max_fail_percentage 15494 1726853334.43029: done checking for max_fail_percentage 15494 1726853334.43030: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.43030: done checking to see if all hosts have failed 15494 1726853334.43031: getting the remaining hosts for this loop 15494 1726853334.43033: done getting the remaining hosts for this loop 15494 1726853334.43037: getting the next task for host managed_node1 15494 1726853334.43045: done getting next task for host managed_node1 15494 1726853334.43047: ^ task is: TASK: Fix CentOS6 Base repo 15494 1726853334.43050: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.43054: getting variables 15494 1726853334.43055: in VariableManager get_vars() 15494 1726853334.43085: Calling all_inventory to load vars for managed_node1 15494 1726853334.43088: Calling groups_inventory to load vars for managed_node1 15494 1726853334.43091: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.43101: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.43104: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.43113: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.43714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.44073: done with get_vars() 15494 1726853334.44081: done getting variables 15494 1726853334.44199: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:28:54 -0400 (0:00:00.044) 0:00:03.058 ****** 15494 1726853334.44229: entering _queue_task() for managed_node1/copy 15494 1726853334.44465: worker is 1 (out of 1 available) 15494 1726853334.44478: exiting _queue_task() for managed_node1/copy 15494 1726853334.44490: done queuing things up, now waiting for results queue to drain 15494 1726853334.44491: waiting for pending results... 
15494 1726853334.44889: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 15494 1726853334.44894: in run() - task 02083763-bbaf-0028-1a50-000000000094 15494 1726853334.44897: variable 'ansible_search_path' from source: unknown 15494 1726853334.44899: variable 'ansible_search_path' from source: unknown 15494 1726853334.44902: calling self._execute() 15494 1726853334.44934: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.44944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.44958: variable 'omit' from source: magic vars 15494 1726853334.45398: variable 'ansible_distribution' from source: facts 15494 1726853334.45429: Evaluated conditional (ansible_distribution == 'CentOS'): True 15494 1726853334.45546: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.45558: Evaluated conditional (ansible_distribution_major_version == '6'): False 15494 1726853334.45565: when evaluation is False, skipping this task 15494 1726853334.45574: _execute() done 15494 1726853334.45582: dumping result to json 15494 1726853334.45590: done dumping result, returning 15494 1726853334.45601: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [02083763-bbaf-0028-1a50-000000000094] 15494 1726853334.45611: sending task result for task 02083763-bbaf-0028-1a50-000000000094 15494 1726853334.45710: done sending task result for task 02083763-bbaf-0028-1a50-000000000094 15494 1726853334.45716: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15494 1726853334.45800: no more pending results, returning what we have 15494 1726853334.45804: results queue empty 15494 1726853334.45804: checking for any_errors_fatal 15494 1726853334.45807: done checking for any_errors_fatal 15494 1726853334.45808: checking for 
max_fail_percentage 15494 1726853334.45809: done checking for max_fail_percentage 15494 1726853334.45810: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.45810: done checking to see if all hosts have failed 15494 1726853334.45811: getting the remaining hosts for this loop 15494 1726853334.45812: done getting the remaining hosts for this loop 15494 1726853334.45816: getting the next task for host managed_node1 15494 1726853334.45822: done getting next task for host managed_node1 15494 1726853334.45825: ^ task is: TASK: Include the task 'enable_epel.yml' 15494 1726853334.45827: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.45831: getting variables 15494 1726853334.45832: in VariableManager get_vars() 15494 1726853334.45858: Calling all_inventory to load vars for managed_node1 15494 1726853334.45861: Calling groups_inventory to load vars for managed_node1 15494 1726853334.45864: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.45877: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.45880: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.45882: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.46244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.46429: done with get_vars() 15494 1726853334.46439: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 13:28:54 -0400 (0:00:00.022) 0:00:03.081 ****** 15494 1726853334.46527: entering _queue_task() for managed_node1/include_tasks 15494 1726853334.46747: worker is 1 (out of 1 available) 15494 1726853334.46759: exiting _queue_task() for managed_node1/include_tasks 15494 1726853334.46769: done queuing things up, now waiting for results queue to drain 15494 1726853334.46912: waiting for pending results... 
15494 1726853334.47056: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 15494 1726853334.47154: in run() - task 02083763-bbaf-0028-1a50-000000000095 15494 1726853334.47479: variable 'ansible_search_path' from source: unknown 15494 1726853334.47482: variable 'ansible_search_path' from source: unknown 15494 1726853334.47485: calling self._execute() 15494 1726853334.47576: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.47580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.47585: variable 'omit' from source: magic vars 15494 1726853334.48621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853334.50509: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853334.50596: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853334.50626: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853334.50656: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853334.50787: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853334.50808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853334.50847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853334.50881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853334.50925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853334.50945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853334.51032: variable '__network_is_ostree' from source: set_fact 15494 1726853334.51054: Evaluated conditional (not __network_is_ostree | d(false)): True 15494 1726853334.51057: _execute() done 15494 1726853334.51061: dumping result to json 15494 1726853334.51064: done dumping result, returning 15494 1726853334.51072: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-0028-1a50-000000000095] 15494 1726853334.51077: sending task result for task 02083763-bbaf-0028-1a50-000000000095 15494 1726853334.51165: done sending task result for task 02083763-bbaf-0028-1a50-000000000095 15494 1726853334.51167: WORKER PROCESS EXITING 15494 1726853334.51194: no more pending results, returning what we have 15494 1726853334.51199: in VariableManager get_vars() 15494 1726853334.51237: Calling all_inventory to load vars for managed_node1 15494 1726853334.51239: Calling groups_inventory to load vars for managed_node1 15494 1726853334.51243: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.51254: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.51256: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.51259: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.51499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 15494 1726853334.51686: done with get_vars() 15494 1726853334.51695: variable 'ansible_search_path' from source: unknown 15494 1726853334.51704: variable 'ansible_search_path' from source: unknown 15494 1726853334.51744: we have included files to process 15494 1726853334.51746: generating all_blocks data 15494 1726853334.51747: done generating all_blocks data 15494 1726853334.51752: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15494 1726853334.51753: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15494 1726853334.51757: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 15494 1726853334.52698: done processing included file 15494 1726853334.52700: iterating over new_blocks loaded from include file 15494 1726853334.52701: in VariableManager get_vars() 15494 1726853334.52713: done with get_vars() 15494 1726853334.52714: filtering new block on tags 15494 1726853334.52742: done filtering new block on tags 15494 1726853334.52745: in VariableManager get_vars() 15494 1726853334.52757: done with get_vars() 15494 1726853334.52759: filtering new block on tags 15494 1726853334.52776: done filtering new block on tags 15494 1726853334.52778: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 15494 1726853334.52791: extending task lists for all hosts with included blocks 15494 1726853334.52901: done extending task lists 15494 1726853334.52903: done processing included files 15494 1726853334.52904: results queue empty 15494 1726853334.52905: checking for any_errors_fatal 15494 1726853334.52907: done checking for any_errors_fatal 15494 1726853334.52907: checking for max_fail_percentage 15494 1726853334.52909: done 
checking for max_fail_percentage 15494 1726853334.52909: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.52910: done checking to see if all hosts have failed 15494 1726853334.52911: getting the remaining hosts for this loop 15494 1726853334.52912: done getting the remaining hosts for this loop 15494 1726853334.52914: getting the next task for host managed_node1 15494 1726853334.52920: done getting next task for host managed_node1 15494 1726853334.52922: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 15494 1726853334.52925: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.52927: getting variables 15494 1726853334.52928: in VariableManager get_vars() 15494 1726853334.52937: Calling all_inventory to load vars for managed_node1 15494 1726853334.52939: Calling groups_inventory to load vars for managed_node1 15494 1726853334.52941: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.52949: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.52957: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.52961: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.53130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.53328: done with get_vars() 15494 1726853334.53336: done getting variables 15494 1726853334.53398: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15494 1726853334.53585: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 13:28:54 -0400 (0:00:00.070) 0:00:03.152 ****** 15494 1726853334.53628: entering _queue_task() for managed_node1/command 15494 1726853334.53630: Creating lock for command 15494 1726853334.54003: worker is 1 (out of 1 available) 15494 1726853334.54020: exiting _queue_task() for managed_node1/command 15494 1726853334.54031: done queuing things up, now waiting for results queue to drain 15494 1726853334.54033: waiting for pending results... 
15494 1726853334.54491: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 15494 1726853334.54495: in run() - task 02083763-bbaf-0028-1a50-0000000000af 15494 1726853334.54498: variable 'ansible_search_path' from source: unknown 15494 1726853334.54502: variable 'ansible_search_path' from source: unknown 15494 1726853334.54579: calling self._execute() 15494 1726853334.54698: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.54708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.54731: variable 'omit' from source: magic vars 15494 1726853334.55263: variable 'ansible_distribution' from source: facts 15494 1726853334.55285: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15494 1726853334.55445: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.55460: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15494 1726853334.55467: when evaluation is False, skipping this task 15494 1726853334.55477: _execute() done 15494 1726853334.55496: dumping result to json 15494 1726853334.55499: done dumping result, returning 15494 1726853334.55531: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [02083763-bbaf-0028-1a50-0000000000af] 15494 1726853334.55534: sending task result for task 02083763-bbaf-0028-1a50-0000000000af skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15494 1726853334.55830: no more pending results, returning what we have 15494 1726853334.55833: results queue empty 15494 1726853334.55834: checking for any_errors_fatal 15494 1726853334.55835: done checking for any_errors_fatal 15494 1726853334.55836: checking for max_fail_percentage 15494 1726853334.55838: done checking for max_fail_percentage 15494 1726853334.55838: checking to see if all hosts have failed 
and the running result is not ok 15494 1726853334.55839: done checking to see if all hosts have failed 15494 1726853334.55840: getting the remaining hosts for this loop 15494 1726853334.55841: done getting the remaining hosts for this loop 15494 1726853334.55844: getting the next task for host managed_node1 15494 1726853334.55857: done getting next task for host managed_node1 15494 1726853334.55860: ^ task is: TASK: Install yum-utils package 15494 1726853334.55863: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.55866: getting variables 15494 1726853334.55868: in VariableManager get_vars() 15494 1726853334.55897: Calling all_inventory to load vars for managed_node1 15494 1726853334.55901: Calling groups_inventory to load vars for managed_node1 15494 1726853334.55904: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.55915: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.55919: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.55922: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.56218: done sending task result for task 02083763-bbaf-0028-1a50-0000000000af 15494 1726853334.56221: WORKER PROCESS EXITING 15494 1726853334.56231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.56348: done with get_vars() 15494 1726853334.56354: done getting variables 15494 1726853334.56432: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 13:28:54 -0400 (0:00:00.028) 0:00:03.180 ****** 15494 1726853334.56451: entering _queue_task() for managed_node1/package 15494 1726853334.56453: Creating lock for package 15494 1726853334.56625: worker is 1 (out of 1 available) 15494 1726853334.56638: exiting _queue_task() for managed_node1/package 15494 1726853334.56649: done queuing things up, now waiting for results queue to drain 15494 1726853334.56651: waiting for pending results... 
15494 1726853334.56786: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 15494 1726853334.56857: in run() - task 02083763-bbaf-0028-1a50-0000000000b0 15494 1726853334.56865: variable 'ansible_search_path' from source: unknown 15494 1726853334.56869: variable 'ansible_search_path' from source: unknown 15494 1726853334.56903: calling self._execute() 15494 1726853334.56956: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.56960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.56968: variable 'omit' from source: magic vars 15494 1726853334.57232: variable 'ansible_distribution' from source: facts 15494 1726853334.57242: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15494 1726853334.57326: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.57330: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15494 1726853334.57333: when evaluation is False, skipping this task 15494 1726853334.57336: _execute() done 15494 1726853334.57338: dumping result to json 15494 1726853334.57341: done dumping result, returning 15494 1726853334.57351: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [02083763-bbaf-0028-1a50-0000000000b0] 15494 1726853334.57354: sending task result for task 02083763-bbaf-0028-1a50-0000000000b0 15494 1726853334.57433: done sending task result for task 02083763-bbaf-0028-1a50-0000000000b0 15494 1726853334.57436: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15494 1726853334.57555: no more pending results, returning what we have 15494 1726853334.57558: results queue empty 15494 1726853334.57559: checking for any_errors_fatal 15494 1726853334.57565: done checking for any_errors_fatal 15494 
1726853334.57565: checking for max_fail_percentage 15494 1726853334.57567: done checking for max_fail_percentage 15494 1726853334.57567: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.57568: done checking to see if all hosts have failed 15494 1726853334.57569: getting the remaining hosts for this loop 15494 1726853334.57570: done getting the remaining hosts for this loop 15494 1726853334.57575: getting the next task for host managed_node1 15494 1726853334.57581: done getting next task for host managed_node1 15494 1726853334.57583: ^ task is: TASK: Enable EPEL 7 15494 1726853334.57586: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.57589: getting variables 15494 1726853334.57590: in VariableManager get_vars() 15494 1726853334.57611: Calling all_inventory to load vars for managed_node1 15494 1726853334.57613: Calling groups_inventory to load vars for managed_node1 15494 1726853334.57616: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.57624: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.57627: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.57630: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.57789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.57992: done with get_vars() 15494 1726853334.58000: done getting variables 15494 1726853334.58075: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 13:28:54 -0400 (0:00:00.016) 0:00:03.196 ****** 15494 1726853334.58118: entering _queue_task() for managed_node1/command 15494 1726853334.58309: worker is 1 (out of 1 available) 15494 1726853334.58318: exiting _queue_task() for managed_node1/command 15494 1726853334.58328: done queuing things up, now waiting for results queue to drain 15494 1726853334.58329: waiting for pending results... 
15494 1726853334.58587: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 15494 1726853334.58635: in run() - task 02083763-bbaf-0028-1a50-0000000000b1 15494 1726853334.58650: variable 'ansible_search_path' from source: unknown 15494 1726853334.58657: variable 'ansible_search_path' from source: unknown 15494 1726853334.58700: calling self._execute() 15494 1726853334.58976: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.58980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.58982: variable 'omit' from source: magic vars 15494 1726853334.59185: variable 'ansible_distribution' from source: facts 15494 1726853334.59203: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15494 1726853334.59336: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.59347: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15494 1726853334.59359: when evaluation is False, skipping this task 15494 1726853334.59366: _execute() done 15494 1726853334.59374: dumping result to json 15494 1726853334.59382: done dumping result, returning 15494 1726853334.59391: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [02083763-bbaf-0028-1a50-0000000000b1] 15494 1726853334.59418: sending task result for task 02083763-bbaf-0028-1a50-0000000000b1 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15494 1726853334.59623: no more pending results, returning what we have 15494 1726853334.59626: results queue empty 15494 1726853334.59627: checking for any_errors_fatal 15494 1726853334.59633: done checking for any_errors_fatal 15494 1726853334.59634: checking for max_fail_percentage 15494 1726853334.59636: done checking for max_fail_percentage 15494 1726853334.59636: checking to see if all hosts have failed and 
the running result is not ok 15494 1726853334.59637: done checking to see if all hosts have failed 15494 1726853334.59638: getting the remaining hosts for this loop 15494 1726853334.59640: done getting the remaining hosts for this loop 15494 1726853334.59643: getting the next task for host managed_node1 15494 1726853334.59651: done getting next task for host managed_node1 15494 1726853334.59654: ^ task is: TASK: Enable EPEL 8 15494 1726853334.59657: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.59661: getting variables 15494 1726853334.59663: in VariableManager get_vars() 15494 1726853334.59693: Calling all_inventory to load vars for managed_node1 15494 1726853334.59696: Calling groups_inventory to load vars for managed_node1 15494 1726853334.59699: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.59711: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.59714: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.59717: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.60016: done sending task result for task 02083763-bbaf-0028-1a50-0000000000b1 15494 1726853334.60019: WORKER PROCESS EXITING 15494 1726853334.60291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.60493: done with get_vars() 15494 1726853334.60500: done getting variables 15494 1726853334.60546: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 13:28:54 -0400 (0:00:00.024) 0:00:03.221 ****** 15494 1726853334.60779: entering _queue_task() for managed_node1/command 15494 1726853334.61086: worker is 1 (out of 1 available) 15494 1726853334.61097: exiting _queue_task() for managed_node1/command 15494 1726853334.61106: done queuing things up, now waiting for results queue to drain 15494 1726853334.61108: waiting for pending results... 
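The "Enable EPEL 7" task above was skipped because its second `when` clause, `ansible_distribution_major_version in ['7', '8']`, evaluated False even though the distribution check passed. A minimal shell sketch of that two-stage gate, with assumed example values for the distribution facts (not read from the actual managed node):

```shell
# Sketch of the two-part 'when' gate the log reports for the
# "Enable EPEL" tasks; distro/major are assumed example values.
distro="CentOS"
major="10"

case "$distro" in
  RedHat|CentOS) distro_ok=1 ;;  # ansible_distribution in ['RedHat', 'CentOS']
  *)             distro_ok=0 ;;
esac

case "$major" in
  7|8) ver_ok=1 ;;               # ansible_distribution_major_version in ['7', '8']
  *)   ver_ok=0 ;;
esac

# Both clauses must hold; otherwise the task is skipped, matching the
# skip_reason "Conditional result was False" seen above.
if [ "$distro_ok" -eq 1 ] && [ "$ver_ok" -eq 1 ]; then
  echo "run"
else
  echo "skip"
fi
```

With the example values, the first clause passes and the second fails, so the task is skipped, which is exactly the `false_condition` the JSON result records.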
15494 1726853334.61323: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 15494 1726853334.61431: in run() - task 02083763-bbaf-0028-1a50-0000000000b2 15494 1726853334.61454: variable 'ansible_search_path' from source: unknown 15494 1726853334.61462: variable 'ansible_search_path' from source: unknown 15494 1726853334.61502: calling self._execute() 15494 1726853334.61580: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.61591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.61657: variable 'omit' from source: magic vars 15494 1726853334.61952: variable 'ansible_distribution' from source: facts 15494 1726853334.61969: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15494 1726853334.62099: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.62176: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 15494 1726853334.62180: when evaluation is False, skipping this task 15494 1726853334.62182: _execute() done 15494 1726853334.62184: dumping result to json 15494 1726853334.62187: done dumping result, returning 15494 1726853334.62191: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [02083763-bbaf-0028-1a50-0000000000b2] 15494 1726853334.62194: sending task result for task 02083763-bbaf-0028-1a50-0000000000b2 15494 1726853334.62253: done sending task result for task 02083763-bbaf-0028-1a50-0000000000b2 15494 1726853334.62256: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 15494 1726853334.62304: no more pending results, returning what we have 15494 1726853334.62308: results queue empty 15494 1726853334.62309: checking for any_errors_fatal 15494 1726853334.62313: done checking for any_errors_fatal 15494 1726853334.62314: checking for 
max_fail_percentage 15494 1726853334.62316: done checking for max_fail_percentage 15494 1726853334.62317: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.62318: done checking to see if all hosts have failed 15494 1726853334.62318: getting the remaining hosts for this loop 15494 1726853334.62320: done getting the remaining hosts for this loop 15494 1726853334.62323: getting the next task for host managed_node1 15494 1726853334.62332: done getting next task for host managed_node1 15494 1726853334.62334: ^ task is: TASK: Enable EPEL 6 15494 1726853334.62338: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.62341: getting variables 15494 1726853334.62343: in VariableManager get_vars() 15494 1726853334.62370: Calling all_inventory to load vars for managed_node1 15494 1726853334.62527: Calling groups_inventory to load vars for managed_node1 15494 1726853334.62530: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.62539: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.62542: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.62545: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.62712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.62901: done with get_vars() 15494 1726853334.62910: done getting variables 15494 1726853334.62965: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 13:28:54 -0400 (0:00:00.024) 0:00:03.245 ****** 15494 1726853334.62995: entering _queue_task() for managed_node1/copy 15494 1726853334.63211: worker is 1 (out of 1 available) 15494 1726853334.63221: exiting _queue_task() for managed_node1/copy 15494 1726853334.63232: done queuing things up, now waiting for results queue to drain 15494 1726853334.63234: waiting for pending results... 
15494 1726853334.63613: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 15494 1726853334.63735: in run() - task 02083763-bbaf-0028-1a50-0000000000b4 15494 1726853334.63755: variable 'ansible_search_path' from source: unknown 15494 1726853334.63818: variable 'ansible_search_path' from source: unknown 15494 1726853334.63822: calling self._execute() 15494 1726853334.63881: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.63893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.63906: variable 'omit' from source: magic vars 15494 1726853334.64325: variable 'ansible_distribution' from source: facts 15494 1726853334.64341: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 15494 1726853334.64452: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.64463: Evaluated conditional (ansible_distribution_major_version == '6'): False 15494 1726853334.64475: when evaluation is False, skipping this task 15494 1726853334.64483: _execute() done 15494 1726853334.64637: dumping result to json 15494 1726853334.64640: done dumping result, returning 15494 1726853334.64643: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [02083763-bbaf-0028-1a50-0000000000b4] 15494 1726853334.64645: sending task result for task 02083763-bbaf-0028-1a50-0000000000b4 15494 1726853334.64706: done sending task result for task 02083763-bbaf-0028-1a50-0000000000b4 15494 1726853334.64709: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 15494 1726853334.64749: no more pending results, returning what we have 15494 1726853334.64752: results queue empty 15494 1726853334.64753: checking for any_errors_fatal 15494 1726853334.64759: done checking for any_errors_fatal 15494 1726853334.64760: checking for max_fail_percentage 
15494 1726853334.64761: done checking for max_fail_percentage 15494 1726853334.64762: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.64763: done checking to see if all hosts have failed 15494 1726853334.64764: getting the remaining hosts for this loop 15494 1726853334.64765: done getting the remaining hosts for this loop 15494 1726853334.64769: getting the next task for host managed_node1 15494 1726853334.64778: done getting next task for host managed_node1 15494 1726853334.64781: ^ task is: TASK: Set network provider to 'nm' 15494 1726853334.64783: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853334.64787: getting variables 15494 1726853334.64789: in VariableManager get_vars() 15494 1726853334.64817: Calling all_inventory to load vars for managed_node1 15494 1726853334.64819: Calling groups_inventory to load vars for managed_node1 15494 1726853334.64822: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.64832: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.64835: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.64837: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.65157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.65334: done with get_vars() 15494 1726853334.65342: done getting variables 15494 1726853334.65397: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Friday 20 September 2024 13:28:54 -0400 (0:00:00.024) 0:00:03.270 ****** 15494 1726853334.65422: entering _queue_task() for managed_node1/set_fact 15494 1726853334.65622: worker is 1 (out of 1 available) 15494 1726853334.65634: exiting _queue_task() for managed_node1/set_fact 15494 1726853334.65644: done queuing things up, now waiting for results queue to drain 15494 1726853334.65646: waiting for pending results... 15494 1726853334.66085: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 15494 1726853334.66091: in run() - task 02083763-bbaf-0028-1a50-000000000007 15494 1726853334.66095: variable 'ansible_search_path' from source: unknown 15494 1726853334.66098: calling self._execute() 15494 1726853334.66100: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.66103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.66105: variable 'omit' from source: magic vars 15494 1726853334.66195: variable 'omit' from source: magic vars 15494 1726853334.66236: variable 'omit' from source: magic vars 15494 1726853334.66276: variable 'omit' from source: magic vars 15494 1726853334.66324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853334.66369: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853334.66398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853334.66421: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853334.66443: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853334.66477: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853334.66486: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.66493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.66594: Set connection var ansible_connection to ssh 15494 1726853334.66605: Set connection var ansible_pipelining to False 15494 1726853334.66614: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853334.66620: Set connection var ansible_shell_type to sh 15494 1726853334.66631: Set connection var ansible_timeout to 10 15494 1726853334.66642: Set connection var ansible_shell_executable to /bin/sh 15494 1726853334.66674: variable 'ansible_shell_executable' from source: unknown 15494 1726853334.66681: variable 'ansible_connection' from source: unknown 15494 1726853334.66688: variable 'ansible_module_compression' from source: unknown 15494 1726853334.66693: variable 'ansible_shell_type' from source: unknown 15494 1726853334.66698: variable 'ansible_shell_executable' from source: unknown 15494 1726853334.66704: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.66711: variable 'ansible_pipelining' from source: unknown 15494 1726853334.66717: variable 'ansible_timeout' from source: unknown 15494 1726853334.66723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.66861: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853334.66977: variable 'omit' from source: magic vars 15494 1726853334.66980: starting 
attempt loop 15494 1726853334.66983: running the handler 15494 1726853334.66985: handler run complete 15494 1726853334.66987: attempt loop complete, returning result 15494 1726853334.66989: _execute() done 15494 1726853334.66991: dumping result to json 15494 1726853334.66993: done dumping result, returning 15494 1726853334.66995: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [02083763-bbaf-0028-1a50-000000000007] 15494 1726853334.66997: sending task result for task 02083763-bbaf-0028-1a50-000000000007 15494 1726853334.67060: done sending task result for task 02083763-bbaf-0028-1a50-000000000007 15494 1726853334.67062: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15494 1726853334.67129: no more pending results, returning what we have 15494 1726853334.67132: results queue empty 15494 1726853334.67133: checking for any_errors_fatal 15494 1726853334.67139: done checking for any_errors_fatal 15494 1726853334.67139: checking for max_fail_percentage 15494 1726853334.67141: done checking for max_fail_percentage 15494 1726853334.67142: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.67142: done checking to see if all hosts have failed 15494 1726853334.67143: getting the remaining hosts for this loop 15494 1726853334.67144: done getting the remaining hosts for this loop 15494 1726853334.67148: getting the next task for host managed_node1 15494 1726853334.67155: done getting next task for host managed_node1 15494 1726853334.67159: ^ task is: TASK: meta (flush_handlers) 15494 1726853334.67161: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.67165: getting variables 15494 1726853334.67166: in VariableManager get_vars() 15494 1726853334.67195: Calling all_inventory to load vars for managed_node1 15494 1726853334.67197: Calling groups_inventory to load vars for managed_node1 15494 1726853334.67200: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.67210: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.67213: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.67216: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.67353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.67603: done with get_vars() 15494 1726853334.67610: done getting variables 15494 1726853334.67652: in VariableManager get_vars() 15494 1726853334.67657: Calling all_inventory to load vars for managed_node1 15494 1726853334.67659: Calling groups_inventory to load vars for managed_node1 15494 1726853334.67660: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.67663: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.67664: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.67666: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.67744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.67848: done with get_vars() 15494 1726853334.67857: done queuing things up, now waiting for results queue to drain 15494 1726853334.67859: results queue empty 15494 1726853334.67859: checking for any_errors_fatal 15494 1726853334.67860: done checking for any_errors_fatal 15494 1726853334.67861: checking for max_fail_percentage 15494 1726853334.67861: done checking for max_fail_percentage 15494 1726853334.67862: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853334.67862: done checking to see if all hosts have failed 15494 1726853334.67863: getting the remaining hosts for this loop 15494 1726853334.67863: done getting the remaining hosts for this loop 15494 1726853334.67864: getting the next task for host managed_node1 15494 1726853334.67867: done getting next task for host managed_node1 15494 1726853334.67868: ^ task is: TASK: meta (flush_handlers) 15494 1726853334.67869: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853334.67876: getting variables 15494 1726853334.67876: in VariableManager get_vars() 15494 1726853334.67881: Calling all_inventory to load vars for managed_node1 15494 1726853334.67882: Calling groups_inventory to load vars for managed_node1 15494 1726853334.67884: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.67886: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.67888: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.67889: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.67966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.68084: done with get_vars() 15494 1726853334.68089: done getting variables 15494 1726853334.68116: in VariableManager get_vars() 15494 1726853334.68122: Calling all_inventory to load vars for managed_node1 15494 1726853334.68123: Calling groups_inventory to load vars for managed_node1 15494 1726853334.68125: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.68127: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.68129: Calling groups_plugins_inventory to load vars for 
managed_node1 15494 1726853334.68130: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.68207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.68315: done with get_vars() 15494 1726853334.68322: done queuing things up, now waiting for results queue to drain 15494 1726853334.68323: results queue empty 15494 1726853334.68323: checking for any_errors_fatal 15494 1726853334.68324: done checking for any_errors_fatal 15494 1726853334.68324: checking for max_fail_percentage 15494 1726853334.68325: done checking for max_fail_percentage 15494 1726853334.68325: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.68326: done checking to see if all hosts have failed 15494 1726853334.68326: getting the remaining hosts for this loop 15494 1726853334.68327: done getting the remaining hosts for this loop 15494 1726853334.68328: getting the next task for host managed_node1 15494 1726853334.68330: done getting next task for host managed_node1 15494 1726853334.68330: ^ task is: None 15494 1726853334.68331: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.68332: done queuing things up, now waiting for results queue to drain 15494 1726853334.68333: results queue empty 15494 1726853334.68333: checking for any_errors_fatal 15494 1726853334.68334: done checking for any_errors_fatal 15494 1726853334.68335: checking for max_fail_percentage 15494 1726853334.68335: done checking for max_fail_percentage 15494 1726853334.68336: checking to see if all hosts have failed and the running result is not ok 15494 1726853334.68336: done checking to see if all hosts have failed 15494 1726853334.68337: getting the next task for host managed_node1 15494 1726853334.68339: done getting next task for host managed_node1 15494 1726853334.68339: ^ task is: None 15494 1726853334.68340: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.68369: in VariableManager get_vars() 15494 1726853334.68384: done with get_vars() 15494 1726853334.68388: in VariableManager get_vars() 15494 1726853334.68393: done with get_vars() 15494 1726853334.68396: variable 'omit' from source: magic vars 15494 1726853334.68414: in VariableManager get_vars() 15494 1726853334.68419: done with get_vars() 15494 1726853334.68431: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 15494 1726853334.68540: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853334.68560: getting the remaining hosts for this loop 15494 1726853334.68561: done getting the remaining hosts for this loop 15494 1726853334.68563: getting the next task for host managed_node1 15494 1726853334.68565: done getting next task for host managed_node1 15494 1726853334.68566: ^ task is: TASK: Gathering Facts 15494 1726853334.68567: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853334.68568: getting variables 15494 1726853334.68568: in VariableManager get_vars() 15494 1726853334.68576: Calling all_inventory to load vars for managed_node1 15494 1726853334.68577: Calling groups_inventory to load vars for managed_node1 15494 1726853334.68578: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853334.68582: Calling all_plugins_play to load vars for managed_node1 15494 1726853334.68591: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853334.68593: Calling groups_plugins_play to load vars for managed_node1 15494 1726853334.68697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853334.68800: done with get_vars() 15494 1726853334.68806: done getting variables 15494 1726853334.68829: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Friday 20 September 2024 13:28:54 -0400 (0:00:00.034) 0:00:03.304 ****** 15494 1726853334.68843: entering _queue_task() for managed_node1/gather_facts 15494 1726853334.69010: worker is 1 (out of 1 available) 15494 1726853334.69022: exiting _queue_task() for managed_node1/gather_facts 15494 1726853334.69034: done queuing things up, now waiting for results queue to drain 15494 1726853334.69035: waiting for pending results... 
15494 1726853334.69197: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853334.69247: in run() - task 02083763-bbaf-0028-1a50-0000000000da 15494 1726853334.69265: variable 'ansible_search_path' from source: unknown 15494 1726853334.69293: calling self._execute() 15494 1726853334.69344: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.69348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.69358: variable 'omit' from source: magic vars 15494 1726853334.69632: variable 'ansible_distribution_major_version' from source: facts 15494 1726853334.69641: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853334.69646: variable 'omit' from source: magic vars 15494 1726853334.69665: variable 'omit' from source: magic vars 15494 1726853334.69707: variable 'omit' from source: magic vars 15494 1726853334.69748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853334.69976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853334.69979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853334.69982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853334.69984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853334.69986: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853334.69988: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.69990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.69993: Set connection var ansible_connection to ssh 15494 1726853334.69995: Set 
connection var ansible_pipelining to False 15494 1726853334.69997: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853334.69999: Set connection var ansible_shell_type to sh 15494 1726853334.70000: Set connection var ansible_timeout to 10 15494 1726853334.70006: Set connection var ansible_shell_executable to /bin/sh 15494 1726853334.70059: variable 'ansible_shell_executable' from source: unknown 15494 1726853334.70068: variable 'ansible_connection' from source: unknown 15494 1726853334.70079: variable 'ansible_module_compression' from source: unknown 15494 1726853334.70087: variable 'ansible_shell_type' from source: unknown 15494 1726853334.70094: variable 'ansible_shell_executable' from source: unknown 15494 1726853334.70101: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853334.70108: variable 'ansible_pipelining' from source: unknown 15494 1726853334.70114: variable 'ansible_timeout' from source: unknown 15494 1726853334.70121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853334.70308: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853334.70322: variable 'omit' from source: magic vars 15494 1726853334.70332: starting attempt loop 15494 1726853334.70338: running the handler 15494 1726853334.70369: variable 'ansible_facts' from source: unknown 15494 1726853334.70392: _low_level_execute_command(): starting 15494 1726853334.70475: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853334.70989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853334.71090: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853334.71111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.71124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.71199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.72887: stdout chunk (state=3): >>>/root <<< 15494 1726853334.72984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.73009: stderr chunk (state=3): >>><<< 15494 1726853334.73013: stdout chunk (state=3): >>><<< 15494 1726853334.73031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853334.73042: _low_level_execute_command(): starting 15494 1726853334.73050: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685 `" && echo ansible-tmp-1726853334.7303138-15692-135728118600685="` echo /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685 `" ) && sleep 0' 15494 1726853334.73569: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853334.73585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.73612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.73689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.75576: stdout chunk (state=3): >>>ansible-tmp-1726853334.7303138-15692-135728118600685=/root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685 <<< 15494 1726853334.75734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.75737: stdout chunk (state=3): >>><<< 15494 1726853334.75739: stderr chunk (state=3): >>><<< 15494 1726853334.75759: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853334.7303138-15692-135728118600685=/root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853334.75976: variable 'ansible_module_compression' from source: unknown 15494 1726853334.75979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853334.75981: variable 'ansible_facts' from source: unknown 15494 1726853334.76134: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py 15494 1726853334.76294: Sending initial data 15494 1726853334.76302: Sent initial data (154 bytes) 15494 1726853334.76865: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853334.76974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853334.76998: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.77015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.77087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.78710: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853334.78772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853334.78896: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp_yflhf94 /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py <<< 15494 1726853334.78900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py" <<< 15494 1726853334.78927: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp_yflhf94" to remote "/root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py" <<< 15494 1726853334.81569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.81583: stdout chunk (state=3): >>><<< 15494 1726853334.81595: stderr chunk (state=3): >>><<< 15494 1726853334.81778: done transferring module to remote 15494 1726853334.81781: _low_level_execute_command(): starting 15494 1726853334.81784: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/ /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py && sleep 0' 15494 1726853334.82915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853334.82919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.83012: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.83176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853334.83194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.83217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853334.83406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853334.85237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853334.85316: stdout chunk (state=3): >>><<< 15494 1726853334.85319: stderr chunk (state=3): >>><<< 15494 1726853334.85334: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853334.85352: _low_level_execute_command(): starting 15494 1726853334.85363: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/AnsiballZ_setup.py && sleep 0' 15494 1726853334.86807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853334.86864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853334.86954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853334.86975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853334.87282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 15494 1726853334.87321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853335.49446: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.57763671875, "5m": 0.341796875, "15m": 0.14794921875}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "55", "epoch": "1726853335", "epoch_int": "1726853335", "date": "2024-09-20", "time": "13:28:55", "iso8601_micro": "2024-09-20T17:28:55.147848Z", "iso8601": "2024-09-20T17:28:55Z", "iso8601_basic": "20240920T132855147848", "iso8601_basic_short": "20240920T132855", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", 
"macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3306, "used": 225}, "swap": {"total": 0, "free": 0, "used": 0, 
"cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 501, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797318656, "block_size": 4096, "block_total": 65519099, 
"block_available": 63915361, "block_used": 1603738, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853335.51363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853335.51367: stdout chunk (state=3): >>><<< 15494 1726853335.51370: stderr chunk (state=3): >>><<< 15494 1726853335.51477: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_is_chroot": false, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, 
"ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.57763671875, "5m": 0.341796875, "15m": 0.14794921875}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "55", "epoch": "1726853335", "epoch_int": "1726853335", "date": "2024-09-20", "time": "13:28:55", "iso8601_micro": "2024-09-20T17:28:55.147848Z", "iso8601": "2024-09-20T17:28:55Z", "iso8601_basic": "20240920T132855147848", "iso8601_basic_short": "20240920T132855", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", 
"broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3306, "used": 225}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": 
null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 501, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797318656, "block_size": 4096, "block_total": 65519099, "block_available": 63915361, "block_used": 1603738, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853335.51940: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853335.52083: _low_level_execute_command(): starting 15494 1726853335.52087: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853334.7303138-15692-135728118600685/ > /dev/null 2>&1 && sleep 0' 15494 1726853335.53620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853335.53906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853335.53942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853335.55806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853335.55810: stdout chunk (state=3): >>><<< 15494 1726853335.55818: stderr chunk (state=3): >>><<< 15494 1726853335.55837: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853335.55846: handler run complete 15494 1726853335.55995: variable 'ansible_facts' from source: unknown 15494 1726853335.56124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.56910: variable 'ansible_facts' from source: unknown 15494 1726853335.57031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.57188: attempt loop complete, returning result 15494 1726853335.57204: _execute() done 15494 1726853335.57212: dumping result to json 15494 1726853335.57332: done dumping result, returning 15494 1726853335.57336: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-0000000000da] 15494 1726853335.57338: sending task result for task 02083763-bbaf-0028-1a50-0000000000da ok: [managed_node1] 15494 1726853335.58882: done sending task result for task 02083763-bbaf-0028-1a50-0000000000da 15494 1726853335.58885: WORKER PROCESS EXITING 15494 1726853335.58893: no more pending results, returning what we have 15494 1726853335.58896: results queue empty 15494 1726853335.58897: checking for any_errors_fatal 15494 1726853335.58898: done checking for any_errors_fatal 15494 1726853335.58899: checking for max_fail_percentage 15494 1726853335.58900: done checking for max_fail_percentage 15494 1726853335.58901: checking to see if all hosts have failed and the running result is not ok 15494 1726853335.58901: done checking to see if all hosts have failed 15494 1726853335.58902: getting the remaining hosts for this loop 15494 1726853335.58903: done getting the remaining hosts for this loop 15494 1726853335.58907: getting the next task for host managed_node1 15494 1726853335.59006: done getting next task for host managed_node1 15494 
1726853335.59009: ^ task is: TASK: meta (flush_handlers) 15494 1726853335.59010: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853335.59014: getting variables 15494 1726853335.59015: in VariableManager get_vars() 15494 1726853335.59044: Calling all_inventory to load vars for managed_node1 15494 1726853335.59049: Calling groups_inventory to load vars for managed_node1 15494 1726853335.59053: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.59062: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.59065: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.59067: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.59427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.59701: done with get_vars() 15494 1726853335.59713: done getting variables 15494 1726853335.59796: in VariableManager get_vars() 15494 1726853335.59806: Calling all_inventory to load vars for managed_node1 15494 1726853335.59808: Calling groups_inventory to load vars for managed_node1 15494 1726853335.59811: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.59815: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.59817: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.59820: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.60001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.60205: done with get_vars() 15494 1726853335.60226: done queuing things up, now waiting for results queue to 
drain 15494 1726853335.60228: results queue empty 15494 1726853335.60229: checking for any_errors_fatal 15494 1726853335.60232: done checking for any_errors_fatal 15494 1726853335.60233: checking for max_fail_percentage 15494 1726853335.60234: done checking for max_fail_percentage 15494 1726853335.60234: checking to see if all hosts have failed and the running result is not ok 15494 1726853335.60235: done checking to see if all hosts have failed 15494 1726853335.60236: getting the remaining hosts for this loop 15494 1726853335.60241: done getting the remaining hosts for this loop 15494 1726853335.60243: getting the next task for host managed_node1 15494 1726853335.60249: done getting next task for host managed_node1 15494 1726853335.60252: ^ task is: TASK: Set interface={{ interface }} 15494 1726853335.60253: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853335.60255: getting variables 15494 1726853335.60256: in VariableManager get_vars() 15494 1726853335.60263: Calling all_inventory to load vars for managed_node1 15494 1726853335.60266: Calling groups_inventory to load vars for managed_node1 15494 1726853335.60268: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.60274: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.60276: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.60279: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.60429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.60639: done with get_vars() 15494 1726853335.60656: done getting variables 15494 1726853335.60697: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853335.60893: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Friday 20 September 2024 13:28:55 -0400 (0:00:00.920) 0:00:04.225 ****** 15494 1726853335.60932: entering _queue_task() for managed_node1/set_fact 15494 1726853335.61504: worker is 1 (out of 1 available) 15494 1726853335.61602: exiting _queue_task() for managed_node1/set_fact 15494 1726853335.61615: done queuing things up, now waiting for results queue to drain 15494 1726853335.61616: waiting for pending results... 
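The banner above shows the `set_fact` task at `tests_bridge.yml:9` being queued, and the trace that follows shows the conditional `ansible_distribution_major_version != '6'` evaluating to True before the handler runs. The actual playbook file is not reproduced in this log, so the following is only a hedged sketch of what such a task plausibly looks like, reconstructed from the task name, the evaluated conditional, and the `ansible_facts` result visible in the log:

```yaml
# Illustrative sketch only -- NOT the verbatim contents of tests_bridge.yml.
# The task name, the `interface` variable, and the distribution-version guard
# are taken from the log; everything else is an assumption.
- name: "Set interface={{ interface }}"
  set_fact:
    interface: "{{ interface }}"   # resolves to LSR-TST-br31 per the play vars
  when: ansible_distribution_major_version != '6'
```

A `set_fact` like this runs entirely on the controller, which is why the log shows "running the handler" and "handler run complete" with no intervening SSH traffic.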
15494 1726853335.61933: running TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 15494 1726853335.62283: in run() - task 02083763-bbaf-0028-1a50-00000000000b 15494 1726853335.62389: variable 'ansible_search_path' from source: unknown 15494 1726853335.62395: calling self._execute() 15494 1726853335.62576: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.62580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853335.62583: variable 'omit' from source: magic vars 15494 1726853335.63195: variable 'ansible_distribution_major_version' from source: facts 15494 1726853335.63211: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853335.63221: variable 'omit' from source: magic vars 15494 1726853335.63259: variable 'omit' from source: magic vars 15494 1726853335.63577: variable 'interface' from source: play vars 15494 1726853335.63580: variable 'interface' from source: play vars 15494 1726853335.63594: variable 'omit' from source: magic vars 15494 1726853335.63654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853335.63761: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853335.63859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853335.63898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853335.63956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853335.64152: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853335.64156: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.64159: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 15494 1726853335.64380: Set connection var ansible_connection to ssh 15494 1726853335.64383: Set connection var ansible_pipelining to False 15494 1726853335.64386: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853335.64388: Set connection var ansible_shell_type to sh 15494 1726853335.64390: Set connection var ansible_timeout to 10 15494 1726853335.64392: Set connection var ansible_shell_executable to /bin/sh 15494 1726853335.64394: variable 'ansible_shell_executable' from source: unknown 15494 1726853335.64488: variable 'ansible_connection' from source: unknown 15494 1726853335.64494: variable 'ansible_module_compression' from source: unknown 15494 1726853335.64501: variable 'ansible_shell_type' from source: unknown 15494 1726853335.64597: variable 'ansible_shell_executable' from source: unknown 15494 1726853335.64601: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.64603: variable 'ansible_pipelining' from source: unknown 15494 1726853335.64605: variable 'ansible_timeout' from source: unknown 15494 1726853335.64607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853335.64850: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853335.64902: variable 'omit' from source: magic vars 15494 1726853335.64913: starting attempt loop 15494 1726853335.64963: running the handler 15494 1726853335.64984: handler run complete 15494 1726853335.64999: attempt loop complete, returning result 15494 1726853335.65006: _execute() done 15494 1726853335.65040: dumping result to json 15494 1726853335.65043: done dumping result, returning 15494 1726853335.65045: done running 
TaskExecutor() for managed_node1/TASK: Set interface=LSR-TST-br31 [02083763-bbaf-0028-1a50-00000000000b] 15494 1726853335.65047: sending task result for task 02083763-bbaf-0028-1a50-00000000000b ok: [managed_node1] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15494 1726853335.65220: no more pending results, returning what we have 15494 1726853335.65223: results queue empty 15494 1726853335.65224: checking for any_errors_fatal 15494 1726853335.65227: done checking for any_errors_fatal 15494 1726853335.65227: checking for max_fail_percentage 15494 1726853335.65229: done checking for max_fail_percentage 15494 1726853335.65229: checking to see if all hosts have failed and the running result is not ok 15494 1726853335.65230: done checking to see if all hosts have failed 15494 1726853335.65231: getting the remaining hosts for this loop 15494 1726853335.65232: done getting the remaining hosts for this loop 15494 1726853335.65236: getting the next task for host managed_node1 15494 1726853335.65242: done getting next task for host managed_node1 15494 1726853335.65245: ^ task is: TASK: Include the task 'show_interfaces.yml' 15494 1726853335.65247: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853335.65252: getting variables 15494 1726853335.65254: in VariableManager get_vars() 15494 1726853335.65402: Calling all_inventory to load vars for managed_node1 15494 1726853335.65405: Calling groups_inventory to load vars for managed_node1 15494 1726853335.65409: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.65421: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.65425: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.65428: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.65841: done sending task result for task 02083763-bbaf-0028-1a50-00000000000b 15494 1726853335.65844: WORKER PROCESS EXITING 15494 1726853335.65866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.66081: done with get_vars() 15494 1726853335.66090: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Friday 20 September 2024 13:28:55 -0400 (0:00:00.052) 0:00:04.277 ****** 15494 1726853335.66186: entering _queue_task() for managed_node1/include_tasks 15494 1726853335.66438: worker is 1 (out of 1 available) 15494 1726853335.66450: exiting _queue_task() for managed_node1/include_tasks 15494 1726853335.66462: done queuing things up, now waiting for results queue to drain 15494 1726853335.66464: waiting for pending results... 
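The banner above marks the `include_tasks` task at `tests_bridge.yml:12`, and the subsequent trace shows the included file being loaded from the collection's `tests/network/playbooks/tasks/` directory. Since the playbook itself is not shown in this log, the following is only an assumed sketch of the include, inferred from the task name and the resolved file path:

```yaml
# Illustrative sketch only -- NOT the verbatim contents of tests_bridge.yml.
# The task name and target file are taken from the log; the `when` guard
# mirrors the conditional the log shows being evaluated for this task.
- name: Include the task 'show_interfaces.yml'
  include_tasks: tasks/show_interfaces.yml
  when: ansible_distribution_major_version != '6'
```

Because `include_tasks` is dynamic, the log correspondingly shows "we have included files to process", "generating all_blocks data", and "extending task lists for all hosts with included blocks" only after the include task itself returns.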
15494 1726853335.66727: running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' 15494 1726853335.66915: in run() - task 02083763-bbaf-0028-1a50-00000000000c 15494 1726853335.66938: variable 'ansible_search_path' from source: unknown 15494 1726853335.66980: calling self._execute() 15494 1726853335.67099: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.67277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853335.67280: variable 'omit' from source: magic vars 15494 1726853335.67981: variable 'ansible_distribution_major_version' from source: facts 15494 1726853335.67987: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853335.67990: _execute() done 15494 1726853335.67993: dumping result to json 15494 1726853335.67995: done dumping result, returning 15494 1726853335.68096: done running TaskExecutor() for managed_node1/TASK: Include the task 'show_interfaces.yml' [02083763-bbaf-0028-1a50-00000000000c] 15494 1726853335.68177: sending task result for task 02083763-bbaf-0028-1a50-00000000000c 15494 1726853335.68283: no more pending results, returning what we have 15494 1726853335.68289: in VariableManager get_vars() 15494 1726853335.68334: Calling all_inventory to load vars for managed_node1 15494 1726853335.68338: Calling groups_inventory to load vars for managed_node1 15494 1726853335.68341: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.68355: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.68358: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.68361: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.68957: done sending task result for task 02083763-bbaf-0028-1a50-00000000000c 15494 1726853335.68960: WORKER PROCESS EXITING 15494 1726853335.69035: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.69250: done with get_vars() 15494 1726853335.69256: variable 'ansible_search_path' from source: unknown 15494 1726853335.69269: we have included files to process 15494 1726853335.69272: generating all_blocks data 15494 1726853335.69274: done generating all_blocks data 15494 1726853335.69275: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15494 1726853335.69276: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15494 1726853335.69278: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15494 1726853335.69462: in VariableManager get_vars() 15494 1726853335.69481: done with get_vars() 15494 1726853335.69596: done processing included file 15494 1726853335.69598: iterating over new_blocks loaded from include file 15494 1726853335.69599: in VariableManager get_vars() 15494 1726853335.69610: done with get_vars() 15494 1726853335.69611: filtering new block on tags 15494 1726853335.69638: done filtering new block on tags 15494 1726853335.69640: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node1 15494 1726853335.69645: extending task lists for all hosts with included blocks 15494 1726853335.69707: done extending task lists 15494 1726853335.69708: done processing included files 15494 1726853335.69709: results queue empty 15494 1726853335.69710: checking for any_errors_fatal 15494 1726853335.69713: done checking for any_errors_fatal 15494 1726853335.69714: checking for max_fail_percentage 15494 1726853335.69715: done checking for 
max_fail_percentage 15494 1726853335.69716: checking to see if all hosts have failed and the running result is not ok 15494 1726853335.69717: done checking to see if all hosts have failed 15494 1726853335.69717: getting the remaining hosts for this loop 15494 1726853335.69719: done getting the remaining hosts for this loop 15494 1726853335.69721: getting the next task for host managed_node1 15494 1726853335.69725: done getting next task for host managed_node1 15494 1726853335.69727: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15494 1726853335.69729: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853335.69731: getting variables 15494 1726853335.69732: in VariableManager get_vars() 15494 1726853335.69749: Calling all_inventory to load vars for managed_node1 15494 1726853335.69752: Calling groups_inventory to load vars for managed_node1 15494 1726853335.69754: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.69759: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.69761: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.69764: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.69944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.70144: done with get_vars() 15494 1726853335.70152: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 13:28:55 -0400 (0:00:00.040) 0:00:04.318 ****** 15494 1726853335.70231: entering _queue_task() for managed_node1/include_tasks 15494 1726853335.70538: worker is 1 (out of 1 available) 15494 1726853335.70549: exiting _queue_task() for managed_node1/include_tasks 15494 1726853335.70562: done queuing things up, now waiting for results queue to drain 15494 1726853335.70563: waiting for pending results... 
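The task header above comes from show_interfaces.yml, which pulls in get_current_interfaces.yml at runtime. The playbook sources are not part of this log, so the following is a hypothetical sketch of that include chain; only the task name and the included file name are taken from the log, the rest is assumed:

```yaml
# show_interfaces.yml — hypothetical sketch; the task name and the included
# file name are from this log, everything else is an assumption.
- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml
```

Because include_tasks is resolved dynamically, the included file is only loaded when the task itself executes — which is why the log shows "we have included files to process", block generation, and tag filtering happening after the task runs rather than at parse time.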
15494 1726853335.70795: running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' 15494 1726853335.70901: in run() - task 02083763-bbaf-0028-1a50-0000000000ee 15494 1726853335.70925: variable 'ansible_search_path' from source: unknown 15494 1726853335.70934: variable 'ansible_search_path' from source: unknown 15494 1726853335.70983: calling self._execute() 15494 1726853335.71083: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.71096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853335.71111: variable 'omit' from source: magic vars 15494 1726853335.71504: variable 'ansible_distribution_major_version' from source: facts 15494 1726853335.71564: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853335.71568: _execute() done 15494 1726853335.71572: dumping result to json 15494 1726853335.71575: done dumping result, returning 15494 1726853335.71578: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_current_interfaces.yml' [02083763-bbaf-0028-1a50-0000000000ee] 15494 1726853335.71580: sending task result for task 02083763-bbaf-0028-1a50-0000000000ee 15494 1726853335.71732: no more pending results, returning what we have 15494 1726853335.71739: in VariableManager get_vars() 15494 1726853335.71775: Calling all_inventory to load vars for managed_node1 15494 1726853335.71778: Calling groups_inventory to load vars for managed_node1 15494 1726853335.71782: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.71796: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.71799: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.71802: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.72175: done sending task result for task 02083763-bbaf-0028-1a50-0000000000ee 15494 1726853335.72179: WORKER PROCESS EXITING 15494 
1726853335.72206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.72383: done with get_vars() 15494 1726853335.72391: variable 'ansible_search_path' from source: unknown 15494 1726853335.72392: variable 'ansible_search_path' from source: unknown 15494 1726853335.72436: we have included files to process 15494 1726853335.72437: generating all_blocks data 15494 1726853335.72439: done generating all_blocks data 15494 1726853335.72440: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15494 1726853335.72441: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15494 1726853335.72444: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15494 1726853335.72782: done processing included file 15494 1726853335.72783: iterating over new_blocks loaded from include file 15494 1726853335.72785: in VariableManager get_vars() 15494 1726853335.72797: done with get_vars() 15494 1726853335.72799: filtering new block on tags 15494 1726853335.72814: done filtering new block on tags 15494 1726853335.72816: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node1 15494 1726853335.72821: extending task lists for all hosts with included blocks 15494 1726853335.72928: done extending task lists 15494 1726853335.72929: done processing included files 15494 1726853335.72930: results queue empty 15494 1726853335.72931: checking for any_errors_fatal 15494 1726853335.72934: done checking for any_errors_fatal 15494 1726853335.72935: checking for max_fail_percentage 15494 1726853335.72936: done 
checking for max_fail_percentage 15494 1726853335.72937: checking to see if all hosts have failed and the running result is not ok 15494 1726853335.72938: done checking to see if all hosts have failed 15494 1726853335.72939: getting the remaining hosts for this loop 15494 1726853335.72940: done getting the remaining hosts for this loop 15494 1726853335.72952: getting the next task for host managed_node1 15494 1726853335.72957: done getting next task for host managed_node1 15494 1726853335.72959: ^ task is: TASK: Gather current interface info 15494 1726853335.72962: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853335.72965: getting variables 15494 1726853335.72966: in VariableManager get_vars() 15494 1726853335.72975: Calling all_inventory to load vars for managed_node1 15494 1726853335.72977: Calling groups_inventory to load vars for managed_node1 15494 1726853335.72979: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853335.72984: Calling all_plugins_play to load vars for managed_node1 15494 1726853335.72986: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853335.72988: Calling groups_plugins_play to load vars for managed_node1 15494 1726853335.73122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853335.73318: done with get_vars() 15494 1726853335.73326: done getting variables 15494 1726853335.73361: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 13:28:55 -0400 (0:00:00.031) 0:00:04.349 ****** 15494 1726853335.73403: entering _queue_task() for managed_node1/command 15494 1726853335.73781: worker is 1 (out of 1 available) 15494 1726853335.73792: exiting _queue_task() for managed_node1/command 15494 1726853335.73802: done queuing things up, now waiting for results queue to drain 15494 1726853335.73803: waiting for pending results... 
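The TaskExecutor run that follows exercises Ansible's standard remote lifecycle for a command task, each step appearing below as a `_low_level_execute_command()` call: discover the remote home directory, create a private temp directory, upload the AnsiballZ payload over SFTP, make it executable, run it with the remote Python, then delete the directory. The sequence can be re-played locally as a sketch — the command shapes are copied from the log, while the demo paths and the one-line stand-in module are assumptions (the real `AnsiballZ_command.py` is a self-contained payload Ansible generates per task):

```shell
# Local re-play of the remote command lifecycle recorded below for one task.
set -eu

sh -c 'echo ~ && sleep 0'                           # 1. discover the home dir

tmp="$(umask 77 && mkdir -p "$HOME/.ansible/tmp" \
  && d="$HOME/.ansible/tmp/ansible-tmp-demo-$$" \
  && mkdir "$d" && echo "$d")"                      # 2. private 0700 tmpdir

printf 'print("ok")\n' > "$tmp/AnsiballZ_demo.py"   # 3. stage a stand-in module
                                                    #    (SFTP put in the log)
chmod u+x "$tmp" "$tmp/AnsiballZ_demo.py"           # 4. chmod dir and module
python3 "$tmp/AnsiballZ_demo.py"                    # 5. execute with python3
rm -rf "$tmp"                                       # 6. clean up the tmpdir
```

Every step runs as `/bin/sh -c '... && sleep 0'` over the multiplexed SSH connection (`auto-mux: Trying existing master`), so no new TCP handshake or authentication is paid per command.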
15494 1726853335.73974: running TaskExecutor() for managed_node1/TASK: Gather current interface info 15494 1726853335.74093: in run() - task 02083763-bbaf-0028-1a50-0000000000fd 15494 1726853335.74137: variable 'ansible_search_path' from source: unknown 15494 1726853335.74148: variable 'ansible_search_path' from source: unknown 15494 1726853335.74168: calling self._execute() 15494 1726853335.74259: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.74353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853335.74363: variable 'omit' from source: magic vars 15494 1726853335.74648: variable 'ansible_distribution_major_version' from source: facts 15494 1726853335.74663: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853335.74675: variable 'omit' from source: magic vars 15494 1726853335.74726: variable 'omit' from source: magic vars 15494 1726853335.74763: variable 'omit' from source: magic vars 15494 1726853335.74819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853335.74860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853335.74888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853335.74926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853335.74943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853335.74979: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853335.74987: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.74994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 
1726853335.75128: Set connection var ansible_connection to ssh 15494 1726853335.75132: Set connection var ansible_pipelining to False 15494 1726853335.75134: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853335.75139: Set connection var ansible_shell_type to sh 15494 1726853335.75175: Set connection var ansible_timeout to 10 15494 1726853335.75179: Set connection var ansible_shell_executable to /bin/sh 15494 1726853335.75189: variable 'ansible_shell_executable' from source: unknown 15494 1726853335.75197: variable 'ansible_connection' from source: unknown 15494 1726853335.75205: variable 'ansible_module_compression' from source: unknown 15494 1726853335.75213: variable 'ansible_shell_type' from source: unknown 15494 1726853335.75237: variable 'ansible_shell_executable' from source: unknown 15494 1726853335.75241: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853335.75347: variable 'ansible_pipelining' from source: unknown 15494 1726853335.75350: variable 'ansible_timeout' from source: unknown 15494 1726853335.75353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853335.75410: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853335.75427: variable 'omit' from source: magic vars 15494 1726853335.75437: starting attempt loop 15494 1726853335.75454: running the handler 15494 1726853335.75477: _low_level_execute_command(): starting 15494 1726853335.75489: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853335.76293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853335.76400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853335.76412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853335.76437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853335.76457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853335.76546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853335.78224: stdout chunk (state=3): >>>/root <<< 15494 1726853335.78328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853335.78442: stderr chunk (state=3): >>><<< 15494 1726853335.78445: stdout chunk (state=3): >>><<< 15494 1726853335.78464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853335.78511: _low_level_execute_command(): starting 15494 1726853335.78515: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843 `" && echo ansible-tmp-1726853335.7847009-15752-189292313681843="` echo /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843 `" ) && sleep 0' 15494 1726853335.79125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853335.79180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15494 1726853335.79259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853335.79296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853335.79363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853335.81244: stdout chunk (state=3): >>>ansible-tmp-1726853335.7847009-15752-189292313681843=/root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843 <<< 15494 1726853335.81430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853335.81433: stdout chunk (state=3): >>><<< 15494 1726853335.81435: stderr chunk (state=3): >>><<< 15494 1726853335.81481: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853335.7847009-15752-189292313681843=/root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853335.81580: variable 'ansible_module_compression' from source: unknown 15494 1726853335.81608: ANSIBALLZ: Using generic lock for ansible.legacy.command 15494 1726853335.81618: ANSIBALLZ: Acquiring lock 15494 1726853335.81629: ANSIBALLZ: Lock acquired: 140002372342736 15494 1726853335.81638: ANSIBALLZ: Creating module 15494 1726853335.92652: ANSIBALLZ: Writing module into payload 15494 1726853335.92714: ANSIBALLZ: Writing module 15494 1726853335.92729: ANSIBALLZ: Renaming module 15494 1726853335.92735: ANSIBALLZ: Done creating module 15494 1726853335.92751: variable 'ansible_facts' from source: unknown 15494 1726853335.92795: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py 15494 1726853335.92891: Sending initial data 15494 1726853335.92894: Sent initial data (156 bytes) 15494 1726853335.93327: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853335.93330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853335.93332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration <<< 15494 1726853335.93334: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853335.93336: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853335.93393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853335.93399: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853335.93401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853335.93444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853335.95107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853335.95148: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853335.95338: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp74912n30 /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py <<< 15494 1726853335.95341: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py" <<< 15494 1726853335.95343: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp74912n30" to remote "/root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py" <<< 15494 1726853335.96442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853335.96445: stderr chunk (state=3): >>><<< 15494 1726853335.96447: stdout chunk (state=3): >>><<< 15494 1726853335.96475: done transferring module to remote 15494 1726853335.96485: _low_level_execute_command(): starting 15494 1726853335.96550: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/ /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py && sleep 0' 15494 1726853335.97273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853335.97387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853335.97407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853335.97411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853335.97559: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853335.99283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853335.99352: stderr chunk (state=3): >>><<< 15494 1726853335.99355: stdout chunk (state=3): >>><<< 15494 1726853335.99456: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853335.99459: _low_level_execute_command(): starting 15494 1726853335.99465: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/AnsiballZ_command.py && sleep 0' 15494 1726853336.00332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853336.00347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.00387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853336.00403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.00487: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.00514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.00536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.00558: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.00675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.16062: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:28:56.156505", "end": "2024-09-20 13:28:56.159890", "delta": "0:00:00.003385", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15494 1726853336.17518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853336.17553: stderr chunk (state=3): >>><<< 15494 1726853336.17557: stdout chunk (state=3): >>><<< 15494 1726853336.17574: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 13:28:56.156505", "end": "2024-09-20 13:28:56.159890", "delta": "0:00:00.003385", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853336.17603: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853336.17610: _low_level_execute_command(): starting 15494 1726853336.17615: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853335.7847009-15752-189292313681843/ > /dev/null 2>&1 && sleep 0' 15494 1726853336.18120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 15494 1726853336.18123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853336.18125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.18127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.18134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.18203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.18207: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.18236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.20021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.20046: stderr chunk (state=3): >>><<< 15494 1726853336.20061: stdout chunk (state=3): >>><<< 15494 1726853336.20077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.20081: handler run complete 15494 1726853336.20116: Evaluated conditional (False): False 15494 1726853336.20120: attempt loop complete, returning result 15494 1726853336.20122: _execute() done 15494 1726853336.20127: dumping result to json 15494 1726853336.20131: done dumping result, returning 15494 1726853336.20139: done running TaskExecutor() for managed_node1/TASK: Gather current interface info [02083763-bbaf-0028-1a50-0000000000fd] 15494 1726853336.20143: sending task result for task 02083763-bbaf-0028-1a50-0000000000fd 15494 1726853336.20239: done sending task result for task 02083763-bbaf-0028-1a50-0000000000fd 15494 1726853336.20241: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "cmd": [
        "ls",
        "-1"
    ],
    "delta": "0:00:00.003385",
    "end": "2024-09-20 13:28:56.159890",
    "rc": 0,
    "start": "2024-09-20 13:28:56.156505"
}

STDOUT:

bonding_masters
eth0
lo

15494 1726853336.20319: no more pending results, returning what we have 15494 1726853336.20322: results queue empty 15494 1726853336.20323: checking for any_errors_fatal 15494 1726853336.20324: done checking for any_errors_fatal
15494 1726853336.20325: checking for max_fail_percentage 15494 1726853336.20326: done checking for max_fail_percentage 15494 1726853336.20327: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.20328: done checking to see if all hosts have failed 15494 1726853336.20328: getting the remaining hosts for this loop 15494 1726853336.20330: done getting the remaining hosts for this loop 15494 1726853336.20333: getting the next task for host managed_node1 15494 1726853336.20339: done getting next task for host managed_node1 15494 1726853336.20341: ^ task is: TASK: Set current_interfaces 15494 1726853336.20345: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853336.20348: getting variables 15494 1726853336.20358: in VariableManager get_vars() 15494 1726853336.20390: Calling all_inventory to load vars for managed_node1 15494 1726853336.20392: Calling groups_inventory to load vars for managed_node1 15494 1726853336.20395: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.20406: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.20408: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.20410: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.20617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.20736: done with get_vars() 15494 1726853336.20743: done getting variables 15494 1726853336.20788: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 13:28:56 -0400 (0:00:00.474) 0:00:04.823 ****** 15494 1726853336.20810: entering _queue_task() for managed_node1/set_fact 15494 1726853336.20994: worker is 1 (out of 1 available) 15494 1726853336.21005: exiting _queue_task() for managed_node1/set_fact 15494 1726853336.21019: done queuing things up, now waiting for results queue to drain 15494 1726853336.21020: waiting for pending results... 
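For readers following the trace: the "Gather current interface info" task that just completed and the "Set current_interfaces" task being queued both live in the tasks/get_current_interfaces.yml file named in the task path above. Based on the logged module_args (`chdir: /sys/class/net`, `_raw_params: ls -1`), the registered variable `_current_interfaces`, and the resulting fact, the file likely resembles the following sketch. This is a hypothetical reconstruction, not the actual file contents:

```yaml
# Hypothetical reconstruction of tasks/get_current_interfaces.yml,
# inferred from the module_args and set_fact result in the trace.
- name: Gather current interface info
  command:
    chdir: /sys/class/net     # matches the logged 'chdir' module arg
    cmd: ls -1                # matches the logged '_raw_params'
  register: _current_interfaces

- name: Set current_interfaces
  set_fact:
    # stdout_lines is an assumption; the trace only shows the final
    # list value ['bonding_masters', 'eth0', 'lo'].
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```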
15494 1726853336.21200: running TaskExecutor() for managed_node1/TASK: Set current_interfaces 15494 1726853336.21268: in run() - task 02083763-bbaf-0028-1a50-0000000000fe 15494 1726853336.21278: variable 'ansible_search_path' from source: unknown 15494 1726853336.21282: variable 'ansible_search_path' from source: unknown 15494 1726853336.21302: calling self._execute() 15494 1726853336.21360: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.21364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.21377: variable 'omit' from source: magic vars 15494 1726853336.21628: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.21637: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.21642: variable 'omit' from source: magic vars 15494 1726853336.21673: variable 'omit' from source: magic vars 15494 1726853336.21747: variable '_current_interfaces' from source: set_fact 15494 1726853336.21797: variable 'omit' from source: magic vars 15494 1726853336.21847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853336.21884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853336.21910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853336.21932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.21944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.21983: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853336.21986: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.21991: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.22072: Set connection var ansible_connection to ssh 15494 1726853336.22076: Set connection var ansible_pipelining to False 15494 1726853336.22082: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853336.22085: Set connection var ansible_shell_type to sh 15494 1726853336.22091: Set connection var ansible_timeout to 10 15494 1726853336.22098: Set connection var ansible_shell_executable to /bin/sh 15494 1726853336.22115: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.22118: variable 'ansible_connection' from source: unknown 15494 1726853336.22121: variable 'ansible_module_compression' from source: unknown 15494 1726853336.22123: variable 'ansible_shell_type' from source: unknown 15494 1726853336.22125: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.22127: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.22136: variable 'ansible_pipelining' from source: unknown 15494 1726853336.22138: variable 'ansible_timeout' from source: unknown 15494 1726853336.22140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.22261: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853336.22269: variable 'omit' from source: magic vars 15494 1726853336.22276: starting attempt loop 15494 1726853336.22279: running the handler 15494 1726853336.22287: handler run complete 15494 1726853336.22295: attempt loop complete, returning result 15494 1726853336.22302: _execute() done 15494 1726853336.22305: dumping result to json 15494 1726853336.22307: done dumping result, returning 15494 
1726853336.22310: done running TaskExecutor() for managed_node1/TASK: Set current_interfaces [02083763-bbaf-0028-1a50-0000000000fe] 15494 1726853336.22312: sending task result for task 02083763-bbaf-0028-1a50-0000000000fe 15494 1726853336.22402: done sending task result for task 02083763-bbaf-0028-1a50-0000000000fe 15494 1726853336.22408: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "lo"
        ]
    },
    "changed": false
}

15494 1726853336.22466: no more pending results, returning what we have 15494 1726853336.22469: results queue empty 15494 1726853336.22470: checking for any_errors_fatal 15494 1726853336.22483: done checking for any_errors_fatal 15494 1726853336.22486: checking for max_fail_percentage 15494 1726853336.22487: done checking for max_fail_percentage 15494 1726853336.22488: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.22489: done checking to see if all hosts have failed 15494 1726853336.22489: getting the remaining hosts for this loop 15494 1726853336.22491: done getting the remaining hosts for this loop 15494 1726853336.22495: getting the next task for host managed_node1 15494 1726853336.22502: done getting next task for host managed_node1 15494 1726853336.22505: ^ task is: TASK: Show current_interfaces 15494 1726853336.22507: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15494 1726853336.22512: getting variables 15494 1726853336.22513: in VariableManager get_vars() 15494 1726853336.22536: Calling all_inventory to load vars for managed_node1 15494 1726853336.22539: Calling groups_inventory to load vars for managed_node1 15494 1726853336.22541: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.22550: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.22552: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.22556: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.22727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.22943: done with get_vars() 15494 1726853336.22952: done getting variables 15494 1726853336.23020: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 13:28:56 -0400 (0:00:00.022) 0:00:04.846 ****** 15494 1726853336.23039: entering _queue_task() for managed_node1/debug 15494 1726853336.23041: Creating lock for debug 15494 1726853336.23283: worker is 1 (out of 1 available) 15494 1726853336.23298: exiting _queue_task() for managed_node1/debug 15494 1726853336.23309: done queuing things up, now waiting for results queue to drain 15494 1726853336.23310: waiting for pending results... 
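The "Show current_interfaces" task queued above comes from tasks/show_interfaces.yml:5 per the task path. Given that it loads the `debug` action plugin and prints the `current_interfaces` fact, it likely resembles this minimal sketch (a hypothetical reconstruction):

```yaml
# Hypothetical reconstruction of tasks/show_interfaces.yml,
# inferred from the debug action and the MSG emitted in the trace.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```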
15494 1726853336.23483: running TaskExecutor() for managed_node1/TASK: Show current_interfaces 15494 1726853336.23569: in run() - task 02083763-bbaf-0028-1a50-0000000000ef 15494 1726853336.23582: variable 'ansible_search_path' from source: unknown 15494 1726853336.23586: variable 'ansible_search_path' from source: unknown 15494 1726853336.23613: calling self._execute() 15494 1726853336.23672: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.23677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.23685: variable 'omit' from source: magic vars 15494 1726853336.24044: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.24054: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.24057: variable 'omit' from source: magic vars 15494 1726853336.24060: variable 'omit' from source: magic vars 15494 1726853336.24278: variable 'current_interfaces' from source: set_fact 15494 1726853336.24281: variable 'omit' from source: magic vars 15494 1726853336.24283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853336.24285: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853336.24288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853336.24290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.24292: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.24294: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853336.24296: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.24298: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.24386: Set connection var ansible_connection to ssh 15494 1726853336.24392: Set connection var ansible_pipelining to False 15494 1726853336.24396: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853336.24398: Set connection var ansible_shell_type to sh 15494 1726853336.24405: Set connection var ansible_timeout to 10 15494 1726853336.24412: Set connection var ansible_shell_executable to /bin/sh 15494 1726853336.24433: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.24437: variable 'ansible_connection' from source: unknown 15494 1726853336.24439: variable 'ansible_module_compression' from source: unknown 15494 1726853336.24442: variable 'ansible_shell_type' from source: unknown 15494 1726853336.24444: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.24449: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.24451: variable 'ansible_pipelining' from source: unknown 15494 1726853336.24453: variable 'ansible_timeout' from source: unknown 15494 1726853336.24455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.24580: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853336.24590: variable 'omit' from source: magic vars 15494 1726853336.24595: starting attempt loop 15494 1726853336.24598: running the handler 15494 1726853336.24681: handler run complete 15494 1726853336.24684: attempt loop complete, returning result 15494 1726853336.24686: _execute() done 15494 1726853336.24688: dumping result to json 15494 1726853336.24690: done dumping result, returning 15494 1726853336.24692: done 
running TaskExecutor() for managed_node1/TASK: Show current_interfaces [02083763-bbaf-0028-1a50-0000000000ef] 15494 1726853336.24695: sending task result for task 02083763-bbaf-0028-1a50-0000000000ef 15494 1726853336.24950: done sending task result for task 02083763-bbaf-0028-1a50-0000000000ef 15494 1726853336.24953: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'lo']

15494 1726853336.24998: no more pending results, returning what we have 15494 1726853336.25002: results queue empty 15494 1726853336.25003: checking for any_errors_fatal 15494 1726853336.25006: done checking for any_errors_fatal 15494 1726853336.25007: checking for max_fail_percentage 15494 1726853336.25008: done checking for max_fail_percentage 15494 1726853336.25009: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.25010: done checking to see if all hosts have failed 15494 1726853336.25010: getting the remaining hosts for this loop 15494 1726853336.25012: done getting the remaining hosts for this loop 15494 1726853336.25015: getting the next task for host managed_node1 15494 1726853336.25021: done getting next task for host managed_node1 15494 1726853336.25025: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15494 1726853336.25027: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15494 1726853336.25029: getting variables 15494 1726853336.25031: in VariableManager get_vars() 15494 1726853336.25054: Calling all_inventory to load vars for managed_node1 15494 1726853336.25057: Calling groups_inventory to load vars for managed_node1 15494 1726853336.25059: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.25068: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.25070: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.25075: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.25520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.25650: done with get_vars() 15494 1726853336.25656: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Friday 20 September 2024 13:28:56 -0400 (0:00:00.026) 0:00:04.872 ****** 15494 1726853336.25715: entering _queue_task() for managed_node1/include_tasks 15494 1726853336.25881: worker is 1 (out of 1 available) 15494 1726853336.25893: exiting _queue_task() for managed_node1/include_tasks 15494 1726853336.25905: done queuing things up, now waiting for results queue to drain 15494 1726853336.25907: waiting for pending results... 
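The include just queued sits at tests_bridge.yml:14 per the task path, and the trace shows it is gated on the same `ansible_distribution_major_version != '6'` conditional as the other tasks. A minimal sketch of what that play-level entry likely looks like (a hypothetical reconstruction; only the task name and included file are confirmed by the log):

```yaml
# Hypothetical reconstruction of the entry at tests_bridge.yml:14.
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml
```

Note that `include_tasks` is evaluated at runtime, which is why the trace shows the included file being loaded, filtered on tags, and its blocks spliced into the task list ("extending task lists for all hosts with included blocks") only after the conditional passes.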
15494 1726853336.26052: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' 15494 1726853336.26108: in run() - task 02083763-bbaf-0028-1a50-00000000000d 15494 1726853336.26118: variable 'ansible_search_path' from source: unknown 15494 1726853336.26146: calling self._execute() 15494 1726853336.26205: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.26208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.26217: variable 'omit' from source: magic vars 15494 1726853336.26464: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.26476: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.26481: _execute() done 15494 1726853336.26484: dumping result to json 15494 1726853336.26488: done dumping result, returning 15494 1726853336.26495: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_device_absent.yml' [02083763-bbaf-0028-1a50-00000000000d] 15494 1726853336.26504: sending task result for task 02083763-bbaf-0028-1a50-00000000000d 15494 1726853336.26579: done sending task result for task 02083763-bbaf-0028-1a50-00000000000d 15494 1726853336.26582: WORKER PROCESS EXITING 15494 1726853336.26614: no more pending results, returning what we have 15494 1726853336.26619: in VariableManager get_vars() 15494 1726853336.26646: Calling all_inventory to load vars for managed_node1 15494 1726853336.26649: Calling groups_inventory to load vars for managed_node1 15494 1726853336.26651: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.26660: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.26663: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.26665: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.26784: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.26894: done with get_vars() 15494 1726853336.26899: variable 'ansible_search_path' from source: unknown 15494 1726853336.26908: we have included files to process 15494 1726853336.26908: generating all_blocks data 15494 1726853336.26910: done generating all_blocks data 15494 1726853336.26913: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15494 1726853336.26913: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15494 1726853336.26915: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15494 1726853336.27015: in VariableManager get_vars() 15494 1726853336.27025: done with get_vars() 15494 1726853336.27131: done processing included file 15494 1726853336.27133: iterating over new_blocks loaded from include file 15494 1726853336.27134: in VariableManager get_vars() 15494 1726853336.27141: done with get_vars() 15494 1726853336.27142: filtering new block on tags 15494 1726853336.27153: done filtering new block on tags 15494 1726853336.27156: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15494 1726853336.27159: extending task lists for all hosts with included blocks 15494 1726853336.27243: done extending task lists 15494 1726853336.27244: done processing included files 15494 1726853336.27245: results queue empty 15494 1726853336.27245: checking for any_errors_fatal 15494 1726853336.27248: done checking for any_errors_fatal 15494 1726853336.27248: checking for max_fail_percentage 15494 1726853336.27249: done 
checking for max_fail_percentage 15494 1726853336.27249: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.27250: done checking to see if all hosts have failed 15494 1726853336.27250: getting the remaining hosts for this loop 15494 1726853336.27251: done getting the remaining hosts for this loop 15494 1726853336.27252: getting the next task for host managed_node1 15494 1726853336.27255: done getting next task for host managed_node1 15494 1726853336.27256: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15494 1726853336.27258: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853336.27260: getting variables 15494 1726853336.27260: in VariableManager get_vars() 15494 1726853336.27267: Calling all_inventory to load vars for managed_node1 15494 1726853336.27269: Calling groups_inventory to load vars for managed_node1 15494 1726853336.27272: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.27276: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.27277: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.27279: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.27353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.27458: done with get_vars() 15494 1726853336.27464: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:28:56 -0400 (0:00:00.017) 0:00:04.890 ****** 15494 1726853336.27513: entering _queue_task() for managed_node1/include_tasks 15494 1726853336.27673: worker is 1 (out of 1 available) 15494 1726853336.27686: exiting _queue_task() for managed_node1/include_tasks 15494 1726853336.27698: done queuing things up, now waiting for results queue to drain 15494 1726853336.27700: waiting for pending results... 
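tasks/assert_device_absent.yml, whose line 3 is the task path of the include queued above, evidently chains into get_interface_stat.yml before asserting. The first task is confirmed by the log; the assertion task is an assumption (its name, the registered variable, and the condition are all hypothetical):

```yaml
# Hypothetical reconstruction of tasks/assert_device_absent.yml.
# Only the include at line 3 is confirmed by the trace.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

# Assumed follow-up: the file name suggests an assertion that the
# device does not exist; 'interface_stat' is a hypothetical var name.
- name: Assert device is absent
  assert:
    that:
      - not interface_stat.stat.exists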
15494 1726853336.27836: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15494 1726853336.27900: in run() - task 02083763-bbaf-0028-1a50-000000000119 15494 1726853336.27909: variable 'ansible_search_path' from source: unknown 15494 1726853336.27913: variable 'ansible_search_path' from source: unknown 15494 1726853336.27941: calling self._execute() 15494 1726853336.28000: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.28003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.28012: variable 'omit' from source: magic vars 15494 1726853336.28376: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.28380: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.28382: _execute() done 15494 1726853336.28384: dumping result to json 15494 1726853336.28385: done dumping result, returning 15494 1726853336.28388: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-0028-1a50-000000000119] 15494 1726853336.28390: sending task result for task 02083763-bbaf-0028-1a50-000000000119 15494 1726853336.28473: no more pending results, returning what we have 15494 1726853336.28478: in VariableManager get_vars() 15494 1726853336.28510: Calling all_inventory to load vars for managed_node1 15494 1726853336.28513: Calling groups_inventory to load vars for managed_node1 15494 1726853336.28517: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.28528: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.28531: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.28534: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.28910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.29098: done 
with get_vars() 15494 1726853336.29104: variable 'ansible_search_path' from source: unknown 15494 1726853336.29106: variable 'ansible_search_path' from source: unknown 15494 1726853336.29128: done sending task result for task 02083763-bbaf-0028-1a50-000000000119 15494 1726853336.29131: WORKER PROCESS EXITING 15494 1726853336.29149: we have included files to process 15494 1726853336.29150: generating all_blocks data 15494 1726853336.29152: done generating all_blocks data 15494 1726853336.29153: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853336.29154: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853336.29156: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853336.29375: done processing included file 15494 1726853336.29377: iterating over new_blocks loaded from include file 15494 1726853336.29379: in VariableManager get_vars() 15494 1726853336.29389: done with get_vars() 15494 1726853336.29391: filtering new block on tags 15494 1726853336.29405: done filtering new block on tags 15494 1726853336.29407: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15494 1726853336.29411: extending task lists for all hosts with included blocks 15494 1726853336.29514: done extending task lists 15494 1726853336.29515: done processing included files 15494 1726853336.29516: results queue empty 15494 1726853336.29516: checking for any_errors_fatal 15494 1726853336.29519: done checking for any_errors_fatal 15494 1726853336.29519: checking for max_fail_percentage 15494 1726853336.29520: done checking for max_fail_percentage 
15494 1726853336.29521: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.29522: done checking to see if all hosts have failed 15494 1726853336.29522: getting the remaining hosts for this loop 15494 1726853336.29523: done getting the remaining hosts for this loop 15494 1726853336.29526: getting the next task for host managed_node1 15494 1726853336.29539: done getting next task for host managed_node1 15494 1726853336.29541: ^ task is: TASK: Get stat for interface {{ interface }} 15494 1726853336.29544: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853336.29546: getting variables 15494 1726853336.29547: in VariableManager get_vars() 15494 1726853336.29554: Calling all_inventory to load vars for managed_node1 15494 1726853336.29556: Calling groups_inventory to load vars for managed_node1 15494 1726853336.29558: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.29562: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.29565: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.29567: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.29700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.29903: done with get_vars() 15494 1726853336.29912: done getting variables 15494 1726853336.30068: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:28:56 -0400 (0:00:00.025) 0:00:04.916 ****** 15494 1726853336.30104: entering _queue_task() for managed_node1/stat 15494 1726853336.30332: worker is 1 (out of 1 available) 15494 1726853336.30345: exiting _queue_task() for managed_node1/stat 15494 1726853336.30357: done queuing things up, now waiting for results queue to drain 15494 1726853336.30358: waiting for pending results... 
15494 1726853336.30638: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15494 1726853336.30745: in run() - task 02083763-bbaf-0028-1a50-000000000133 15494 1726853336.30778: variable 'ansible_search_path' from source: unknown 15494 1726853336.30781: variable 'ansible_search_path' from source: unknown 15494 1726853336.30846: calling self._execute() 15494 1726853336.30899: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.30910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.30925: variable 'omit' from source: magic vars 15494 1726853336.31634: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.31717: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.31720: variable 'omit' from source: magic vars 15494 1726853336.31723: variable 'omit' from source: magic vars 15494 1726853336.31806: variable 'interface' from source: set_fact 15494 1726853336.31835: variable 'omit' from source: magic vars 15494 1726853336.31880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853336.31917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853336.31947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853336.31969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.31989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.32023: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853336.32040: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.32155: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.32158: Set connection var ansible_connection to ssh 15494 1726853336.32176: Set connection var ansible_pipelining to False 15494 1726853336.32184: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853336.32190: Set connection var ansible_shell_type to sh 15494 1726853336.32197: Set connection var ansible_timeout to 10 15494 1726853336.32206: Set connection var ansible_shell_executable to /bin/sh 15494 1726853336.32231: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.32238: variable 'ansible_connection' from source: unknown 15494 1726853336.32244: variable 'ansible_module_compression' from source: unknown 15494 1726853336.32249: variable 'ansible_shell_type' from source: unknown 15494 1726853336.32255: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.32264: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.32278: variable 'ansible_pipelining' from source: unknown 15494 1726853336.32284: variable 'ansible_timeout' from source: unknown 15494 1726853336.32290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.32547: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853336.32585: variable 'omit' from source: magic vars 15494 1726853336.32592: starting attempt loop 15494 1726853336.32599: running the handler 15494 1726853336.32605: _low_level_execute_command(): starting 15494 1726853336.32677: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853336.33799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.33912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.34029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.34068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.35742: stdout chunk (state=3): >>>/root <<< 15494 1726853336.35939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.35943: stdout chunk (state=3): >>><<< 15494 1726853336.35945: stderr chunk (state=3): >>><<< 15494 1726853336.35948: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.35991: _low_level_execute_command(): starting 15494 1726853336.35995: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242 `" && echo ansible-tmp-1726853336.3595178-15782-83535751231242="` echo /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242 `" ) && sleep 0' 15494 1726853336.37188: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.37383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853336.37387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853336.37391: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853336.37394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.37404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853336.37407: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853336.37409: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853336.37411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.37414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.37417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853336.37418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853336.37420: stderr chunk (state=3): >>>debug2: match found <<< 15494 1726853336.37422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.37649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.37653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.37655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.39487: stdout chunk (state=3): >>>ansible-tmp-1726853336.3595178-15782-83535751231242=/root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242 <<< 15494 1726853336.39623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.39626: stdout chunk (state=3): >>><<< 15494 1726853336.39634: stderr chunk (state=3): >>><<< 15494 1726853336.39654: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853336.3595178-15782-83535751231242=/root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.39812: variable 'ansible_module_compression' from source: unknown 15494 1726853336.39869: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15494 1726853336.39992: variable 'ansible_facts' from source: unknown 15494 1726853336.40096: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py 15494 1726853336.40464: Sending initial data 15494 1726853336.40468: Sent initial data (152 bytes) 15494 1726853336.41748: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.41800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.41817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.41868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.41960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.43492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853336.43546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853336.43609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp1r_yt0o6 /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py <<< 15494 1726853336.43621: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py" <<< 15494 1726853336.43663: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp1r_yt0o6" to remote "/root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py" <<< 15494 1726853336.44864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.44867: stdout chunk (state=3): >>><<< 15494 1726853336.44869: stderr chunk (state=3): >>><<< 15494 1726853336.44879: done transferring module to remote 15494 1726853336.44882: _low_level_execute_command(): starting 15494 1726853336.44884: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/ /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py && sleep 0' 15494 1726853336.45968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853336.45987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.46167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.46497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.46718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.48504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.48508: stdout chunk (state=3): >>><<< 15494 1726853336.48510: stderr chunk (state=3): >>><<< 15494 1726853336.48513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.48515: _low_level_execute_command(): starting 15494 1726853336.48518: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/AnsiballZ_stat.py && sleep 0' 15494 1726853336.50131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.50136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853336.50139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853336.50142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.50525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.50546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.50668: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15494 1726853336.50778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.65905: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15494 1726853336.67334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853336.67338: stdout chunk (state=3): >>><<< 15494 1726853336.67340: stderr chunk (state=3): >>><<< 15494 1726853336.67342: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853336.67345: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853336.67347: _low_level_execute_command(): starting 15494 1726853336.67349: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853336.3595178-15782-83535751231242/ > /dev/null 2>&1 && sleep 0' 15494 1726853336.67829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.67832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.67834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.67836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853336.67838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.67887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.67890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.67932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.69825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.69878: stderr chunk (state=3): >>><<< 15494 1726853336.69881: stdout chunk (state=3): >>><<< 15494 1726853336.69956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.69959: handler run complete 15494 1726853336.69978: attempt loop complete, returning result 15494 1726853336.69982: _execute() done 15494 1726853336.69984: dumping result to json 15494 1726853336.69986: done dumping result, returning 15494 1726853336.69993: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000133] 15494 1726853336.69997: sending task result for task 02083763-bbaf-0028-1a50-000000000133 15494 1726853336.70110: done sending task result for task 02083763-bbaf-0028-1a50-000000000133 15494 1726853336.70113: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15494 1726853336.70546: no more pending results, returning what we have 15494 1726853336.70549: results queue empty 15494 1726853336.70550: checking for any_errors_fatal 15494 1726853336.70551: done checking for any_errors_fatal 15494 1726853336.70552: checking for max_fail_percentage 15494 1726853336.70553: done checking for max_fail_percentage 15494 1726853336.70554: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.70555: done checking to see if all hosts have failed 15494 1726853336.70555: getting the remaining hosts for this loop 15494 1726853336.70556: done getting the remaining hosts for this loop 15494 1726853336.70560: getting the next task for host managed_node1 15494 1726853336.70566: done getting next task for host managed_node1 15494 1726853336.70568: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15494 1726853336.70570: ^ state is: HOST STATE: block=2, 
task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853336.70574: getting variables 15494 1726853336.70576: in VariableManager get_vars() 15494 1726853336.70597: Calling all_inventory to load vars for managed_node1 15494 1726853336.70600: Calling groups_inventory to load vars for managed_node1 15494 1726853336.70603: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.70613: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.70616: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.70619: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.70768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.70951: done with get_vars() 15494 1726853336.70959: done getting variables 15494 1726853336.71050: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15494 1726853336.71162: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 
13:28:56 -0400 (0:00:00.410) 0:00:05.327 ****** 15494 1726853336.71190: entering _queue_task() for managed_node1/assert 15494 1726853336.71192: Creating lock for assert 15494 1726853336.71433: worker is 1 (out of 1 available) 15494 1726853336.71447: exiting _queue_task() for managed_node1/assert 15494 1726853336.71459: done queuing things up, now waiting for results queue to drain 15494 1726853336.71460: waiting for pending results... 15494 1726853336.71692: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15494 1726853336.71726: in run() - task 02083763-bbaf-0028-1a50-00000000011a 15494 1726853336.71734: variable 'ansible_search_path' from source: unknown 15494 1726853336.71738: variable 'ansible_search_path' from source: unknown 15494 1726853336.71768: calling self._execute() 15494 1726853336.71831: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.71834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.71843: variable 'omit' from source: magic vars 15494 1726853336.72098: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.72113: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.72116: variable 'omit' from source: magic vars 15494 1726853336.72141: variable 'omit' from source: magic vars 15494 1726853336.72206: variable 'interface' from source: set_fact 15494 1726853336.72220: variable 'omit' from source: magic vars 15494 1726853336.72254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853336.72282: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853336.72298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853336.72310: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.72319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.72348: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853336.72352: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.72354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.72417: Set connection var ansible_connection to ssh 15494 1726853336.72421: Set connection var ansible_pipelining to False 15494 1726853336.72426: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853336.72429: Set connection var ansible_shell_type to sh 15494 1726853336.72436: Set connection var ansible_timeout to 10 15494 1726853336.72442: Set connection var ansible_shell_executable to /bin/sh 15494 1726853336.72460: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.72465: variable 'ansible_connection' from source: unknown 15494 1726853336.72467: variable 'ansible_module_compression' from source: unknown 15494 1726853336.72470: variable 'ansible_shell_type' from source: unknown 15494 1726853336.72474: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.72477: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.72479: variable 'ansible_pipelining' from source: unknown 15494 1726853336.72482: variable 'ansible_timeout' from source: unknown 15494 1726853336.72484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.72674: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853336.72678: variable 'omit' from source: magic vars 15494 1726853336.72680: starting attempt loop 15494 1726853336.72683: running the handler 15494 1726853336.72812: variable 'interface_stat' from source: set_fact 15494 1726853336.72815: Evaluated conditional (not interface_stat.stat.exists): True 15494 1726853336.72817: handler run complete 15494 1726853336.72820: attempt loop complete, returning result 15494 1726853336.72822: _execute() done 15494 1726853336.72824: dumping result to json 15494 1726853336.72826: done dumping result, returning 15494 1726853336.72828: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [02083763-bbaf-0028-1a50-00000000011a] 15494 1726853336.72830: sending task result for task 02083763-bbaf-0028-1a50-00000000011a 15494 1726853336.72891: done sending task result for task 02083763-bbaf-0028-1a50-00000000011a 15494 1726853336.72894: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853336.72961: no more pending results, returning what we have 15494 1726853336.72966: results queue empty 15494 1726853336.72966: checking for any_errors_fatal 15494 1726853336.73134: done checking for any_errors_fatal 15494 1726853336.73136: checking for max_fail_percentage 15494 1726853336.73138: done checking for max_fail_percentage 15494 1726853336.73138: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.73139: done checking to see if all hosts have failed 15494 1726853336.73140: getting the remaining hosts for this loop 15494 1726853336.73141: done getting the remaining hosts for this loop 15494 1726853336.73144: getting the next task for host managed_node1 15494 1726853336.73154: done getting next task for host 
managed_node1 15494 1726853336.73155: ^ task is: TASK: meta (flush_handlers) 15494 1726853336.73157: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853336.73160: getting variables 15494 1726853336.73161: in VariableManager get_vars() 15494 1726853336.73217: Calling all_inventory to load vars for managed_node1 15494 1726853336.73220: Calling groups_inventory to load vars for managed_node1 15494 1726853336.73223: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.73301: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.73306: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.73310: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.73619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.73941: done with get_vars() 15494 1726853336.73953: done getting variables 15494 1726853336.74027: in VariableManager get_vars() 15494 1726853336.74068: Calling all_inventory to load vars for managed_node1 15494 1726853336.74072: Calling groups_inventory to load vars for managed_node1 15494 1726853336.74075: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.74079: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.74081: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.74084: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.74221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.74356: done with get_vars() 15494 1726853336.74365: done queuing things up, now waiting for 
results queue to drain 15494 1726853336.74366: results queue empty 15494 1726853336.74367: checking for any_errors_fatal 15494 1726853336.74368: done checking for any_errors_fatal 15494 1726853336.74369: checking for max_fail_percentage 15494 1726853336.74370: done checking for max_fail_percentage 15494 1726853336.74372: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.74374: done checking to see if all hosts have failed 15494 1726853336.74378: getting the remaining hosts for this loop 15494 1726853336.74379: done getting the remaining hosts for this loop 15494 1726853336.74380: getting the next task for host managed_node1 15494 1726853336.74383: done getting next task for host managed_node1 15494 1726853336.74384: ^ task is: TASK: meta (flush_handlers) 15494 1726853336.74385: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853336.74386: getting variables 15494 1726853336.74387: in VariableManager get_vars() 15494 1726853336.74392: Calling all_inventory to load vars for managed_node1 15494 1726853336.74393: Calling groups_inventory to load vars for managed_node1 15494 1726853336.74395: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.74398: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.74399: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.74401: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.74484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.74613: done with get_vars() 15494 1726853336.74618: done getting variables 15494 1726853336.74650: in VariableManager get_vars() 15494 1726853336.74656: Calling all_inventory to load vars for managed_node1 15494 1726853336.74658: Calling groups_inventory to load vars for managed_node1 15494 1726853336.74660: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.74663: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.74664: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.74666: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.74741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.74854: done with get_vars() 15494 1726853336.74862: done queuing things up, now waiting for results queue to drain 15494 1726853336.74863: results queue empty 15494 1726853336.74864: checking for any_errors_fatal 15494 1726853336.74864: done checking for any_errors_fatal 15494 1726853336.74865: checking for max_fail_percentage 15494 1726853336.74865: done checking for max_fail_percentage 15494 1726853336.74866: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853336.74867: done checking to see if all hosts have failed 15494 1726853336.74867: getting the remaining hosts for this loop 15494 1726853336.74868: done getting the remaining hosts for this loop 15494 1726853336.74870: getting the next task for host managed_node1 15494 1726853336.74874: done getting next task for host managed_node1 15494 1726853336.74875: ^ task is: None 15494 1726853336.74876: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853336.74876: done queuing things up, now waiting for results queue to drain 15494 1726853336.74877: results queue empty 15494 1726853336.74877: checking for any_errors_fatal 15494 1726853336.74878: done checking for any_errors_fatal 15494 1726853336.74878: checking for max_fail_percentage 15494 1726853336.74879: done checking for max_fail_percentage 15494 1726853336.74879: checking to see if all hosts have failed and the running result is not ok 15494 1726853336.74879: done checking to see if all hosts have failed 15494 1726853336.74881: getting the next task for host managed_node1 15494 1726853336.74882: done getting next task for host managed_node1 15494 1726853336.74882: ^ task is: None 15494 1726853336.74883: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853336.74912: in VariableManager get_vars() 15494 1726853336.74928: done with get_vars() 15494 1726853336.74931: in VariableManager get_vars() 15494 1726853336.74938: done with get_vars() 15494 1726853336.74941: variable 'omit' from source: magic vars 15494 1726853336.74961: in VariableManager get_vars() 15494 1726853336.74969: done with get_vars() 15494 1726853336.74985: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 15494 1726853336.75355: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853336.75373: getting the remaining hosts for this loop 15494 1726853336.75374: done getting the remaining hosts for this loop 15494 1726853336.75376: getting the next task for host managed_node1 15494 1726853336.75378: done getting next task for host managed_node1 15494 1726853336.75379: ^ task is: TASK: Gathering Facts 15494 1726853336.75380: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853336.75381: getting variables 15494 1726853336.75382: in VariableManager get_vars() 15494 1726853336.75388: Calling all_inventory to load vars for managed_node1 15494 1726853336.75389: Calling groups_inventory to load vars for managed_node1 15494 1726853336.75391: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853336.75393: Calling all_plugins_play to load vars for managed_node1 15494 1726853336.75395: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853336.75396: Calling groups_plugins_play to load vars for managed_node1 15494 1726853336.75479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853336.75585: done with get_vars() 15494 1726853336.75591: done getting variables 15494 1726853336.75615: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Friday 20 September 2024 13:28:56 -0400 (0:00:00.044) 0:00:05.372 ****** 15494 1726853336.75632: entering _queue_task() for managed_node1/gather_facts 15494 1726853336.75819: worker is 1 (out of 1 available) 15494 1726853336.75832: exiting _queue_task() for managed_node1/gather_facts 15494 1726853336.75843: done queuing things up, now waiting for results queue to drain 15494 1726853336.75845: waiting for pending results... 
15494 1726853336.75994: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853336.76049: in run() - task 02083763-bbaf-0028-1a50-00000000014c 15494 1726853336.76059: variable 'ansible_search_path' from source: unknown 15494 1726853336.76093: calling self._execute() 15494 1726853336.76152: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.76155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.76163: variable 'omit' from source: magic vars 15494 1726853336.76426: variable 'ansible_distribution_major_version' from source: facts 15494 1726853336.76435: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853336.76441: variable 'omit' from source: magic vars 15494 1726853336.76461: variable 'omit' from source: magic vars 15494 1726853336.76487: variable 'omit' from source: magic vars 15494 1726853336.76519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853336.76548: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853336.76562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853336.76575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.76585: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853336.76610: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853336.76614: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.76618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.76683: Set connection var ansible_connection to ssh 15494 1726853336.76686: Set 
connection var ansible_pipelining to False 15494 1726853336.76692: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853336.76695: Set connection var ansible_shell_type to sh 15494 1726853336.76699: Set connection var ansible_timeout to 10 15494 1726853336.76706: Set connection var ansible_shell_executable to /bin/sh 15494 1726853336.76723: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.76728: variable 'ansible_connection' from source: unknown 15494 1726853336.76730: variable 'ansible_module_compression' from source: unknown 15494 1726853336.76733: variable 'ansible_shell_type' from source: unknown 15494 1726853336.76735: variable 'ansible_shell_executable' from source: unknown 15494 1726853336.76738: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853336.76740: variable 'ansible_pipelining' from source: unknown 15494 1726853336.76742: variable 'ansible_timeout' from source: unknown 15494 1726853336.76744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853336.76872: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853336.76881: variable 'omit' from source: magic vars 15494 1726853336.76884: starting attempt loop 15494 1726853336.76887: running the handler 15494 1726853336.76900: variable 'ansible_facts' from source: unknown 15494 1726853336.76914: _low_level_execute_command(): starting 15494 1726853336.76921: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853336.77544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15494 1726853336.77548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.77551: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.77553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.77606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.77609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.77681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.79336: stdout chunk (state=3): >>>/root <<< 15494 1726853336.79435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.79461: stderr chunk (state=3): >>><<< 15494 1726853336.79465: stdout chunk (state=3): >>><<< 15494 1726853336.79487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.79499: _low_level_execute_command(): starting 15494 1726853336.79503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296 `" && echo ansible-tmp-1726853336.794866-15817-67152881634296="` echo /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296 `" ) && sleep 0' 15494 1726853336.80032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.80036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.80085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.80111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.80148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.82021: stdout chunk (state=3): >>>ansible-tmp-1726853336.794866-15817-67152881634296=/root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296 <<< 15494 1726853336.82155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.82168: stderr chunk (state=3): >>><<< 15494 1726853336.82174: stdout chunk (state=3): >>><<< 15494 1726853336.82188: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853336.794866-15817-67152881634296=/root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.82212: variable 'ansible_module_compression' from source: unknown 15494 1726853336.82253: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853336.82309: variable 'ansible_facts' from source: unknown 15494 1726853336.82496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py 15494 1726853336.82665: Sending initial data 15494 1726853336.82668: Sent initial data (152 bytes) 15494 1726853336.83203: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853336.83206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.83214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.83254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.83258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.83261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.83294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.84806: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15494 1726853336.84813: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853336.84844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853336.84887: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpyw9t6v8i /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py <<< 15494 1726853336.84891: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py" <<< 15494 1726853336.84935: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpyw9t6v8i" to remote "/root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py" <<< 15494 1726853336.85961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.85999: stderr chunk (state=3): >>><<< 15494 1726853336.86002: stdout chunk (state=3): >>><<< 15494 1726853336.86017: done transferring module to remote 15494 1726853336.86028: _low_level_execute_command(): starting 15494 1726853336.86031: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/ /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py && sleep 0' 15494 1726853336.86466: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.86470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853336.86474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.86477: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853336.86479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.86531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.86533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.86568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853336.88284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853336.88311: stderr chunk (state=3): >>><<< 15494 1726853336.88313: stdout chunk (state=3): >>><<< 15494 1726853336.88326: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853336.88329: _low_level_execute_command(): starting 15494 1726853336.88331: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/AnsiballZ_setup.py && sleep 0' 15494 1726853336.88759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853336.88762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.88764: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853336.88766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853336.88768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853336.88823: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 15494 1726853336.88830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853336.88831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853336.88873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853337.52494: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, 
"used": 563, "free": 2968}, "nocache": {"free": 3305, "used": 226}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 503, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 
0, "size_total": 268366229504, "size_available": 261797347328, "block_size": 4096, "block_total": 65519099, "block_available": 63915368, "block_used": 1603731, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": 
"on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, 
"ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.57763671875, "5m": 0.341796875, "15m": 0.14794921875}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "57", "epoch": "1726853337", "epoch_int": "1726853337", "date": "2024-09-20", "time": "13:28:57", "iso8601_micro": "2024-09-20T17:28:57.519985Z", "iso8601": "2024-09-20T17:28:57Z", "iso8601_basic": "20240920T132857519985", "iso8601_basic_short": "20240920T132857", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", 
"SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853337.54558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853337.54562: stdout chunk (state=3): >>><<< 15494 1726853337.54567: stderr chunk (state=3): >>><<< 15494 1726853337.54613: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2968, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 563, "free": 2968}, "nocache": {"free": 3305, "used": 226}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 503, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797347328, "block_size": 4096, "block_total": 65519099, "block_available": 63915368, "block_used": 1603731, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.57763671875, "5m": 0.341796875, "15m": 0.14794921875}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "28", "second": "57", "epoch": "1726853337", "epoch_int": "1726853337", "date": "2024-09-20", "time": "13:28:57", "iso8601_micro": "2024-09-20T17:28:57.519985Z", "iso8601": "2024-09-20T17:28:57Z", "iso8601_basic": "20240920T132857519985", "iso8601_basic_short": "20240920T132857", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853337.55588: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853337.55611: _low_level_execute_command(): starting 15494 1726853337.55616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853336.794866-15817-67152881634296/ > /dev/null 2>&1 && sleep 0' 15494 1726853337.56901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853337.56914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853337.56924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853337.57013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853337.57231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853337.57251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853337.59106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853337.59109: stdout chunk (state=3): >>><<< 15494 1726853337.59111: stderr chunk (state=3): >>><<< 15494 1726853337.59276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853337.59279: handler run complete 15494 1726853337.59281: variable 'ansible_facts' from source: unknown 15494 1726853337.59376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.59696: variable 'ansible_facts' from source: unknown 15494 1726853337.59783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.59934: attempt loop complete, returning result 15494 1726853337.59943: _execute() done 15494 1726853337.59952: dumping result to json 15494 1726853337.59991: done dumping result, returning 15494 1726853337.60003: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-00000000014c] 15494 1726853337.60011: sending task result for task 02083763-bbaf-0028-1a50-00000000014c ok: [managed_node1] 15494 1726853337.60727: no more pending results, returning what we have 15494 1726853337.60730: results queue empty 15494 1726853337.60731: checking for any_errors_fatal 15494 1726853337.60732: done checking for any_errors_fatal 15494 1726853337.60733: checking for max_fail_percentage 15494 1726853337.60734: done checking for max_fail_percentage 15494 1726853337.60735: checking to see if all hosts have failed and the running result is not ok 15494 1726853337.60736: done checking to see if all hosts have failed 15494 1726853337.60736: getting the remaining hosts for this loop 15494 1726853337.60738: done getting the remaining hosts for this loop 15494 1726853337.60740: getting the next task for host managed_node1 15494 1726853337.60745: done getting next task for host managed_node1 15494 
1726853337.60749: ^ task is: TASK: meta (flush_handlers) 15494 1726853337.60751: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853337.60753: getting variables 15494 1726853337.60755: in VariableManager get_vars() 15494 1726853337.60787: Calling all_inventory to load vars for managed_node1 15494 1726853337.60790: Calling groups_inventory to load vars for managed_node1 15494 1726853337.60791: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.60797: done sending task result for task 02083763-bbaf-0028-1a50-00000000014c 15494 1726853337.60799: WORKER PROCESS EXITING 15494 1726853337.60808: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.60810: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.60812: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.60968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.61200: done with get_vars() 15494 1726853337.61219: done getting variables 15494 1726853337.61292: in VariableManager get_vars() 15494 1726853337.61304: Calling all_inventory to load vars for managed_node1 15494 1726853337.61306: Calling groups_inventory to load vars for managed_node1 15494 1726853337.61308: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.61313: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.61322: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.61326: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.61485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 15494 1726853337.61923: done with get_vars() 15494 1726853337.61935: done queuing things up, now waiting for results queue to drain 15494 1726853337.61937: results queue empty 15494 1726853337.61938: checking for any_errors_fatal 15494 1726853337.61941: done checking for any_errors_fatal 15494 1726853337.61941: checking for max_fail_percentage 15494 1726853337.61942: done checking for max_fail_percentage 15494 1726853337.61943: checking to see if all hosts have failed and the running result is not ok 15494 1726853337.61950: done checking to see if all hosts have failed 15494 1726853337.61951: getting the remaining hosts for this loop 15494 1726853337.61952: done getting the remaining hosts for this loop 15494 1726853337.61954: getting the next task for host managed_node1 15494 1726853337.61958: done getting next task for host managed_node1 15494 1726853337.61960: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15494 1726853337.61962: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853337.62082: getting variables 15494 1726853337.62083: in VariableManager get_vars() 15494 1726853337.62096: Calling all_inventory to load vars for managed_node1 15494 1726853337.62098: Calling groups_inventory to load vars for managed_node1 15494 1726853337.62100: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.62104: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.62106: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.62108: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.62334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.62781: done with get_vars() 15494 1726853337.62788: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:28:57 -0400 (0:00:00.872) 0:00:06.244 ****** 15494 1726853337.62906: entering _queue_task() for managed_node1/include_tasks 15494 1726853337.64009: worker is 1 (out of 1 available) 15494 1726853337.64017: exiting _queue_task() for managed_node1/include_tasks 15494 1726853337.64029: done queuing things up, now waiting for results queue to drain 15494 1726853337.64031: waiting for pending results... 
15494 1726853337.64058: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15494 1726853337.64135: in run() - task 02083763-bbaf-0028-1a50-000000000014 15494 1726853337.64177: variable 'ansible_search_path' from source: unknown 15494 1726853337.64180: variable 'ansible_search_path' from source: unknown 15494 1726853337.64388: calling self._execute() 15494 1726853337.64475: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.64479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853337.64489: variable 'omit' from source: magic vars 15494 1726853337.65232: variable 'ansible_distribution_major_version' from source: facts 15494 1726853337.65476: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853337.65480: _execute() done 15494 1726853337.65483: dumping result to json 15494 1726853337.65486: done dumping result, returning 15494 1726853337.65489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-0028-1a50-000000000014] 15494 1726853337.65492: sending task result for task 02083763-bbaf-0028-1a50-000000000014 15494 1726853337.65565: done sending task result for task 02083763-bbaf-0028-1a50-000000000014 15494 1726853337.65569: WORKER PROCESS EXITING 15494 1726853337.65612: no more pending results, returning what we have 15494 1726853337.65618: in VariableManager get_vars() 15494 1726853337.65776: Calling all_inventory to load vars for managed_node1 15494 1726853337.65780: Calling groups_inventory to load vars for managed_node1 15494 1726853337.65782: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.65792: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.65794: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.65797: Calling 
groups_plugins_play to load vars for managed_node1 15494 1726853337.66190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.66515: done with get_vars() 15494 1726853337.66523: variable 'ansible_search_path' from source: unknown 15494 1726853337.66526: variable 'ansible_search_path' from source: unknown 15494 1726853337.66555: we have included files to process 15494 1726853337.66556: generating all_blocks data 15494 1726853337.66557: done generating all_blocks data 15494 1726853337.66558: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853337.66559: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853337.66561: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853337.68207: done processing included file 15494 1726853337.68210: iterating over new_blocks loaded from include file 15494 1726853337.68211: in VariableManager get_vars() 15494 1726853337.68232: done with get_vars() 15494 1726853337.68234: filtering new block on tags 15494 1726853337.68252: done filtering new block on tags 15494 1726853337.68255: in VariableManager get_vars() 15494 1726853337.68423: done with get_vars() 15494 1726853337.68425: filtering new block on tags 15494 1726853337.68444: done filtering new block on tags 15494 1726853337.68450: in VariableManager get_vars() 15494 1726853337.68469: done with get_vars() 15494 1726853337.68473: filtering new block on tags 15494 1726853337.68489: done filtering new block on tags 15494 1726853337.68492: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15494 1726853337.68497: extending task lists for 
all hosts with included blocks 15494 1726853337.69257: done extending task lists 15494 1726853337.69259: done processing included files 15494 1726853337.69259: results queue empty 15494 1726853337.69260: checking for any_errors_fatal 15494 1726853337.69261: done checking for any_errors_fatal 15494 1726853337.69262: checking for max_fail_percentage 15494 1726853337.69263: done checking for max_fail_percentage 15494 1726853337.69263: checking to see if all hosts have failed and the running result is not ok 15494 1726853337.69264: done checking to see if all hosts have failed 15494 1726853337.69265: getting the remaining hosts for this loop 15494 1726853337.69266: done getting the remaining hosts for this loop 15494 1726853337.69268: getting the next task for host managed_node1 15494 1726853337.69274: done getting next task for host managed_node1 15494 1726853337.69276: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15494 1726853337.69278: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853337.69381: getting variables 15494 1726853337.69382: in VariableManager get_vars() 15494 1726853337.69401: Calling all_inventory to load vars for managed_node1 15494 1726853337.69404: Calling groups_inventory to load vars for managed_node1 15494 1726853337.69406: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.69411: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.69413: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.69416: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.69690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.69897: done with get_vars() 15494 1726853337.69907: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:28:57 -0400 (0:00:00.070) 0:00:06.315 ****** 15494 1726853337.69987: entering _queue_task() for managed_node1/setup 15494 1726853337.70443: worker is 1 (out of 1 available) 15494 1726853337.70458: exiting _queue_task() for managed_node1/setup 15494 1726853337.70476: done queuing things up, now waiting for results queue to drain 15494 1726853337.70478: waiting for pending results... 
15494 1726853337.70736: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15494 1726853337.70868: in run() - task 02083763-bbaf-0028-1a50-00000000018d 15494 1726853337.70978: variable 'ansible_search_path' from source: unknown 15494 1726853337.70986: variable 'ansible_search_path' from source: unknown 15494 1726853337.71033: calling self._execute() 15494 1726853337.71121: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.71203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853337.71207: variable 'omit' from source: magic vars 15494 1726853337.71543: variable 'ansible_distribution_major_version' from source: facts 15494 1726853337.71564: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853337.71857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853337.74331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853337.74416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853337.74466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853337.74504: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853337.74542: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853337.74635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853337.74672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853337.74699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853337.74742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853337.74761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853337.74820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853337.74887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853337.74912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853337.74964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853337.74983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853337.75183: variable '__network_required_facts' from source: role 
'' defaults 15494 1726853337.75186: variable 'ansible_facts' from source: unknown 15494 1726853337.75381: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15494 1726853337.75384: when evaluation is False, skipping this task 15494 1726853337.75386: _execute() done 15494 1726853337.75388: dumping result to json 15494 1726853337.75390: done dumping result, returning 15494 1726853337.75392: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-0028-1a50-00000000018d] 15494 1726853337.75401: sending task result for task 02083763-bbaf-0028-1a50-00000000018d skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853337.75597: no more pending results, returning what we have 15494 1726853337.75602: results queue empty 15494 1726853337.75603: checking for any_errors_fatal 15494 1726853337.75605: done checking for any_errors_fatal 15494 1726853337.75608: checking for max_fail_percentage 15494 1726853337.75610: done checking for max_fail_percentage 15494 1726853337.75611: checking to see if all hosts have failed and the running result is not ok 15494 1726853337.75612: done checking to see if all hosts have failed 15494 1726853337.75613: getting the remaining hosts for this loop 15494 1726853337.75615: done getting the remaining hosts for this loop 15494 1726853337.75619: getting the next task for host managed_node1 15494 1726853337.75628: done getting next task for host managed_node1 15494 1726853337.75632: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15494 1726853337.75634: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853337.75655: getting variables 15494 1726853337.75657: in VariableManager get_vars() 15494 1726853337.75696: Calling all_inventory to load vars for managed_node1 15494 1726853337.75698: Calling groups_inventory to load vars for managed_node1 15494 1726853337.75701: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.75710: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.75712: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.75714: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.75921: done sending task result for task 02083763-bbaf-0028-1a50-00000000018d 15494 1726853337.75925: WORKER PROCESS EXITING 15494 1726853337.75949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.76172: done with get_vars() 15494 1726853337.76185: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:28:57 -0400 (0:00:00.063) 0:00:06.378 ****** 15494 1726853337.76290: entering _queue_task() for managed_node1/stat 15494 1726853337.76572: worker is 1 (out of 1 available) 15494 1726853337.76586: exiting _queue_task() for managed_node1/stat 15494 1726853337.76599: done queuing things up, now waiting for results queue to drain 15494 1726853337.76600: waiting for pending results... 
15494 1726853337.76824: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15494 1726853337.77026: in run() - task 02083763-bbaf-0028-1a50-00000000018f 15494 1726853337.77031: variable 'ansible_search_path' from source: unknown 15494 1726853337.77033: variable 'ansible_search_path' from source: unknown 15494 1726853337.77036: calling self._execute() 15494 1726853337.77153: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.77157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853337.77160: variable 'omit' from source: magic vars 15494 1726853337.77636: variable 'ansible_distribution_major_version' from source: facts 15494 1726853337.77640: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853337.77720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853337.78127: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853337.78227: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853337.78269: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853337.78339: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853337.78661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853337.78879: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853337.78883: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853337.78896: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853337.79084: variable '__network_is_ostree' from source: set_fact 15494 1726853337.79102: Evaluated conditional (not __network_is_ostree is defined): False 15494 1726853337.79109: when evaluation is False, skipping this task 15494 1726853337.79116: _execute() done 15494 1726853337.79132: dumping result to json 15494 1726853337.79141: done dumping result, returning 15494 1726853337.79157: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-0028-1a50-00000000018f] 15494 1726853337.79167: sending task result for task 02083763-bbaf-0028-1a50-00000000018f 15494 1726853337.79535: done sending task result for task 02083763-bbaf-0028-1a50-00000000018f 15494 1726853337.79538: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15494 1726853337.79599: no more pending results, returning what we have 15494 1726853337.79603: results queue empty 15494 1726853337.79605: checking for any_errors_fatal 15494 1726853337.79608: done checking for any_errors_fatal 15494 1726853337.79609: checking for max_fail_percentage 15494 1726853337.79611: done checking for max_fail_percentage 15494 1726853337.79612: checking to see if all hosts have failed and the running result is not ok 15494 1726853337.79613: done checking to see if all hosts have failed 15494 1726853337.79614: getting the remaining hosts for this loop 15494 1726853337.79616: done getting the remaining hosts for this loop 15494 
1726853337.79625: getting the next task for host managed_node1 15494 1726853337.79632: done getting next task for host managed_node1 15494 1726853337.79640: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15494 1726853337.79644: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853337.79660: getting variables 15494 1726853337.79662: in VariableManager get_vars() 15494 1726853337.79705: Calling all_inventory to load vars for managed_node1 15494 1726853337.79708: Calling groups_inventory to load vars for managed_node1 15494 1726853337.79710: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.79721: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.79724: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.79866: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.80179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.80468: done with get_vars() 15494 1726853337.80481: done getting variables 15494 1726853337.80578: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:28:57 -0400 (0:00:00.043) 0:00:06.421 ****** 15494 1726853337.80611: entering _queue_task() for managed_node1/set_fact 15494 1726853337.81196: worker is 1 (out of 1 available) 15494 1726853337.81206: exiting _queue_task() for managed_node1/set_fact 15494 1726853337.81217: done queuing things up, now waiting for results queue to drain 15494 1726853337.81218: waiting for pending results... 15494 1726853337.81352: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15494 1726853337.81444: in run() - task 02083763-bbaf-0028-1a50-000000000190 15494 1726853337.81474: variable 'ansible_search_path' from source: unknown 15494 1726853337.81483: variable 'ansible_search_path' from source: unknown 15494 1726853337.81557: calling self._execute() 15494 1726853337.81620: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.81632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853337.81648: variable 'omit' from source: magic vars 15494 1726853337.82053: variable 'ansible_distribution_major_version' from source: facts 15494 1726853337.82098: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853337.82254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853337.82557: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853337.82608: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853337.82756: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 
1726853337.82759: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853337.82797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853337.82829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853337.82876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853337.82907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853337.83011: variable '__network_is_ostree' from source: set_fact 15494 1726853337.83024: Evaluated conditional (not __network_is_ostree is defined): False 15494 1726853337.83032: when evaluation is False, skipping this task 15494 1726853337.83039: _execute() done 15494 1726853337.83049: dumping result to json 15494 1726853337.83057: done dumping result, returning 15494 1726853337.83070: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-0028-1a50-000000000190] 15494 1726853337.83144: sending task result for task 02083763-bbaf-0028-1a50-000000000190 15494 1726853337.83612: done sending task result for task 02083763-bbaf-0028-1a50-000000000190 15494 1726853337.83615: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15494 1726853337.83668: no more pending results, returning what we 
have 15494 1726853337.83675: results queue empty 15494 1726853337.83676: checking for any_errors_fatal 15494 1726853337.83730: done checking for any_errors_fatal 15494 1726853337.83731: checking for max_fail_percentage 15494 1726853337.83734: done checking for max_fail_percentage 15494 1726853337.83735: checking to see if all hosts have failed and the running result is not ok 15494 1726853337.83736: done checking to see if all hosts have failed 15494 1726853337.83736: getting the remaining hosts for this loop 15494 1726853337.83738: done getting the remaining hosts for this loop 15494 1726853337.83742: getting the next task for host managed_node1 15494 1726853337.83754: done getting next task for host managed_node1 15494 1726853337.83758: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15494 1726853337.83761: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853337.83777: getting variables 15494 1726853337.83780: in VariableManager get_vars() 15494 1726853337.83999: Calling all_inventory to load vars for managed_node1 15494 1726853337.84002: Calling groups_inventory to load vars for managed_node1 15494 1726853337.84004: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853337.84129: Calling all_plugins_play to load vars for managed_node1 15494 1726853337.84134: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853337.84137: Calling groups_plugins_play to load vars for managed_node1 15494 1726853337.84327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853337.84963: done with get_vars() 15494 1726853337.84977: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:28:57 -0400 (0:00:00.044) 0:00:06.466 ****** 15494 1726853337.85136: entering _queue_task() for managed_node1/service_facts 15494 1726853337.85138: Creating lock for service_facts 15494 1726853337.85956: worker is 1 (out of 1 available) 15494 1726853337.85970: exiting _queue_task() for managed_node1/service_facts 15494 1726853337.85986: done queuing things up, now waiting for results queue to drain 15494 1726853337.85988: waiting for pending results... 
15494 1726853337.86690: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15494 1726853337.86698: in run() - task 02083763-bbaf-0028-1a50-000000000192 15494 1726853337.86701: variable 'ansible_search_path' from source: unknown 15494 1726853337.86705: variable 'ansible_search_path' from source: unknown 15494 1726853337.86708: calling self._execute() 15494 1726853337.86763: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.86789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853337.86806: variable 'omit' from source: magic vars 15494 1726853337.87237: variable 'ansible_distribution_major_version' from source: facts 15494 1726853337.87258: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853337.87274: variable 'omit' from source: magic vars 15494 1726853337.87349: variable 'omit' from source: magic vars 15494 1726853337.87424: variable 'omit' from source: magic vars 15494 1726853337.87448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853337.87491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853337.87603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853337.87641: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853337.87657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853337.87693: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853337.87752: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.87755: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15494 1726853337.87827: Set connection var ansible_connection to ssh 15494 1726853337.87840: Set connection var ansible_pipelining to False 15494 1726853337.87860: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853337.87875: Set connection var ansible_shell_type to sh 15494 1726853337.87887: Set connection var ansible_timeout to 10 15494 1726853337.87899: Set connection var ansible_shell_executable to /bin/sh 15494 1726853337.87928: variable 'ansible_shell_executable' from source: unknown 15494 1726853337.87967: variable 'ansible_connection' from source: unknown 15494 1726853337.87976: variable 'ansible_module_compression' from source: unknown 15494 1726853337.87979: variable 'ansible_shell_type' from source: unknown 15494 1726853337.87981: variable 'ansible_shell_executable' from source: unknown 15494 1726853337.87983: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853337.87985: variable 'ansible_pipelining' from source: unknown 15494 1726853337.87987: variable 'ansible_timeout' from source: unknown 15494 1726853337.87989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853337.88277: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853337.88282: variable 'omit' from source: magic vars 15494 1726853337.88284: starting attempt loop 15494 1726853337.88292: running the handler 15494 1726853337.88295: _low_level_execute_command(): starting 15494 1726853337.88299: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853337.89203: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853337.89209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853337.89231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853337.89265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853337.89339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853337.91044: stdout chunk (state=3): >>>/root <<< 15494 1726853337.91190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853337.91194: stdout chunk (state=3): >>><<< 15494 1726853337.91197: stderr chunk (state=3): >>><<< 15494 1726853337.91214: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853337.91231: _low_level_execute_command(): starting 15494 1726853337.91313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577 `" && echo ansible-tmp-1726853337.9122062-15876-210913954341577="` echo /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577 `" ) && sleep 0' 15494 1726853337.91789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853337.91795: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853337.91816: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853337.91877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853337.91895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853337.91959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853337.93837: stdout chunk (state=3): >>>ansible-tmp-1726853337.9122062-15876-210913954341577=/root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577 <<< 15494 1726853337.93974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853337.93978: stderr chunk (state=3): >>><<< 15494 1726853337.93986: stdout chunk (state=3): >>><<< 15494 1726853337.94000: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853337.9122062-15876-210913954341577=/root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853337.94054: variable 'ansible_module_compression' from source: unknown 15494 1726853337.94111: ANSIBALLZ: Using lock for service_facts 15494 1726853337.94114: ANSIBALLZ: Acquiring lock 15494 1726853337.94116: ANSIBALLZ: Lock acquired: 140002370384240 15494 1726853337.94123: ANSIBALLZ: Creating module 15494 1726853338.06432: ANSIBALLZ: Writing module into payload 15494 1726853338.06498: ANSIBALLZ: Writing module 15494 1726853338.06516: ANSIBALLZ: Renaming module 15494 1726853338.06527: ANSIBALLZ: Done creating module 15494 1726853338.06543: variable 'ansible_facts' from source: unknown 15494 1726853338.06602: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py 15494 1726853338.06703: Sending initial data 15494 1726853338.06707: Sent initial data (162 bytes) 15494 1726853338.07143: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853338.07180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853338.07184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853338.07187: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853338.07189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853338.07192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853338.07239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853338.07242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853338.07245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853338.07293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853338.08929: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853338.08968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853338.09006: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpv0mujibi /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py <<< 15494 1726853338.09010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py" <<< 15494 1726853338.09050: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpv0mujibi" to remote "/root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py" <<< 15494 1726853338.09583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853338.09662: stderr chunk (state=3): >>><<< 15494 1726853338.09664: stdout chunk (state=3): >>><<< 15494 1726853338.09672: done transferring module to remote 15494 1726853338.09682: _low_level_execute_command(): starting 15494 1726853338.09687: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/ /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py && sleep 0' 15494 1726853338.10120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853338.10123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853338.10125: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853338.10127: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853338.10135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853338.10190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853338.10192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853338.10229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853338.11999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853338.12038: stderr chunk (state=3): >>><<< 15494 1726853338.12042: stdout chunk (state=3): >>><<< 15494 1726853338.12061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853338.12153: _low_level_execute_command(): starting 15494 1726853338.12158: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/AnsiballZ_service_facts.py && sleep 0' 15494 1726853338.13034: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853338.13049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853338.13092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853338.13096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853338.13145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853338.13193: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853338.13197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853338.13210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853338.13264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853339.66398: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15494 1726853339.68204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853339.68208: stdout chunk (state=3): >>><<< 15494 1726853339.68211: stderr chunk (state=3): >>><<< 15494 1726853339.68215: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853339.69345: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853339.69661: _low_level_execute_command(): starting 15494 1726853339.69665: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853337.9122062-15876-210913954341577/ > /dev/null 2>&1 && sleep 0' 15494 1726853339.70540: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853339.70579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853339.70699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853339.70742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853339.70765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853339.70809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853339.70851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853339.72886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853339.72890: stdout chunk (state=3): >>><<< 15494 1726853339.72892: stderr chunk (state=3): >>><<< 15494 1726853339.72895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 15494 1726853339.72897: handler run complete 15494 1726853339.73239: variable 'ansible_facts' from source: unknown 15494 1726853339.73435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853339.74278: variable 'ansible_facts' from source: unknown 15494 1726853339.74395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853339.74834: attempt loop complete, returning result 15494 1726853339.74854: _execute() done 15494 1726853339.74880: dumping result to json 15494 1726853339.75018: done dumping result, returning 15494 1726853339.75065: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-0028-1a50-000000000192] 15494 1726853339.75094: sending task result for task 02083763-bbaf-0028-1a50-000000000192 15494 1726853339.76425: done sending task result for task 02083763-bbaf-0028-1a50-000000000192 15494 1726853339.76428: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853339.76528: no more pending results, returning what we have 15494 1726853339.76531: results queue empty 15494 1726853339.76532: checking for any_errors_fatal 15494 1726853339.76534: done checking for any_errors_fatal 15494 1726853339.76535: checking for max_fail_percentage 15494 1726853339.76536: done checking for max_fail_percentage 15494 1726853339.76537: checking to see if all hosts have failed and the running result is not ok 15494 1726853339.76538: done checking to see if all hosts have failed 15494 1726853339.76538: getting the remaining hosts for this loop 15494 1726853339.76540: done getting the remaining hosts for this loop 15494 1726853339.76543: getting the next task for host managed_node1 15494 
1726853339.76547: done getting next task for host managed_node1 15494 1726853339.76551: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15494 1726853339.76553: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853339.76562: getting variables 15494 1726853339.76563: in VariableManager get_vars() 15494 1726853339.76595: Calling all_inventory to load vars for managed_node1 15494 1726853339.76598: Calling groups_inventory to load vars for managed_node1 15494 1726853339.76600: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853339.76608: Calling all_plugins_play to load vars for managed_node1 15494 1726853339.76611: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853339.76614: Calling groups_plugins_play to load vars for managed_node1 15494 1726853339.76987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853339.78026: done with get_vars() 15494 1726853339.78039: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:28:59 -0400 (0:00:01.930) 0:00:08.398 ****** 15494 1726853339.78224: entering _queue_task() for managed_node1/package_facts 15494 1726853339.78226: Creating lock for package_facts 
15494 1726853339.78522: worker is 1 (out of 1 available) 15494 1726853339.78533: exiting _queue_task() for managed_node1/package_facts 15494 1726853339.78543: done queuing things up, now waiting for results queue to drain 15494 1726853339.78544: waiting for pending results... 15494 1726853339.78886: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15494 1726853339.78890: in run() - task 02083763-bbaf-0028-1a50-000000000193 15494 1726853339.78894: variable 'ansible_search_path' from source: unknown 15494 1726853339.78896: variable 'ansible_search_path' from source: unknown 15494 1726853339.78911: calling self._execute() 15494 1726853339.79000: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853339.79011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853339.79026: variable 'omit' from source: magic vars 15494 1726853339.79390: variable 'ansible_distribution_major_version' from source: facts 15494 1726853339.79407: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853339.79419: variable 'omit' from source: magic vars 15494 1726853339.79487: variable 'omit' from source: magic vars 15494 1726853339.79526: variable 'omit' from source: magic vars 15494 1726853339.79568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853339.79610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853339.79633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853339.79655: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853339.79754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15494 1726853339.79795: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853339.79806: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853339.79813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853339.79947: Set connection var ansible_connection to ssh 15494 1726853339.79978: Set connection var ansible_pipelining to False 15494 1726853339.79981: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853339.79983: Set connection var ansible_shell_type to sh 15494 1726853339.79985: Set connection var ansible_timeout to 10 15494 1726853339.79997: Set connection var ansible_shell_executable to /bin/sh 15494 1726853339.80067: variable 'ansible_shell_executable' from source: unknown 15494 1726853339.80070: variable 'ansible_connection' from source: unknown 15494 1726853339.80075: variable 'ansible_module_compression' from source: unknown 15494 1726853339.80077: variable 'ansible_shell_type' from source: unknown 15494 1726853339.80160: variable 'ansible_shell_executable' from source: unknown 15494 1726853339.80163: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853339.80166: variable 'ansible_pipelining' from source: unknown 15494 1726853339.80168: variable 'ansible_timeout' from source: unknown 15494 1726853339.80170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853339.80550: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853339.80573: variable 'omit' from source: magic vars 15494 1726853339.80583: starting attempt loop 15494 1726853339.80590: running the handler 15494 1726853339.80608: _low_level_execute_command(): starting 15494 
1726853339.80621: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853339.81393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853339.81453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853339.81504: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853339.81556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853339.83196: stdout chunk (state=3): >>>/root <<< 15494 1726853339.83450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853339.83455: stdout chunk (state=3): >>><<< 15494 1726853339.83459: stderr chunk (state=3): >>><<< 15494 1726853339.83462: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853339.83465: _low_level_execute_command(): starting 15494 1726853339.83468: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220 `" && echo ansible-tmp-1726853339.8335845-15977-82045185017220="` echo /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220 `" ) && sleep 0' 15494 1726853339.83916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853339.83919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853339.83922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15494 1726853339.83924: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853339.83934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853339.83974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853339.83978: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853339.83991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853339.84041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853339.85911: stdout chunk (state=3): >>>ansible-tmp-1726853339.8335845-15977-82045185017220=/root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220 <<< 15494 1726853339.86055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853339.86059: stdout chunk (state=3): >>><<< 15494 1726853339.86062: stderr chunk (state=3): >>><<< 15494 1726853339.86086: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853339.8335845-15977-82045185017220=/root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853339.86137: variable 'ansible_module_compression' from source: unknown 15494 1726853339.86205: ANSIBALLZ: Using lock for package_facts 15494 1726853339.86208: ANSIBALLZ: Acquiring lock 15494 1726853339.86210: ANSIBALLZ: Lock acquired: 140002368276288 15494 1726853339.86212: ANSIBALLZ: Creating module 15494 1726853340.11849: ANSIBALLZ: Writing module into payload 15494 1726853340.11955: ANSIBALLZ: Writing module 15494 1726853340.11978: ANSIBALLZ: Renaming module 15494 1726853340.11998: ANSIBALLZ: Done creating module 15494 1726853340.12019: variable 'ansible_facts' from source: unknown 15494 1726853340.12210: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py 15494 1726853340.12480: Sending initial data 15494 1726853340.12483: Sent initial data (161 bytes) 15494 1726853340.12936: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853340.13038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853340.13056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853340.13131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853340.14759: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853340.14872: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15494 1726853340.14907: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853340.14911: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853340.15015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp0byz238u /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py <<< 15494 1726853340.15019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py" <<< 15494 1726853340.15103: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp0byz238u" to remote "/root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py" <<< 15494 1726853340.17528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853340.17566: stderr chunk (state=3): >>><<< 15494 1726853340.17604: stdout chunk (state=3): >>><<< 15494 1726853340.17607: done transferring module to remote 15494 1726853340.17621: _low_level_execute_command(): starting 15494 1726853340.17630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/ /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py && sleep 0' 15494 1726853340.18578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853340.18582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853340.18585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853340.18590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853340.18593: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853340.18595: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853340.18596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853340.18598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853340.18600: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853340.18606: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853340.18609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853340.18611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853340.18613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853340.18614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853340.20341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853340.20390: stderr chunk (state=3): >>><<< 15494 1726853340.20393: stdout chunk (state=3): >>><<< 15494 1726853340.20412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853340.20415: _low_level_execute_command(): starting 15494 1726853340.20421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/AnsiballZ_package_facts.py && sleep 0' 15494 1726853340.21085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853340.21156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853340.21159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853340.21169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853340.21214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853340.65579: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": 
[{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15494 1726853340.65604: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": 
[{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": 
"0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": 
"2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", 
"version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": 
"systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": 
[{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": 
"openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": 
[{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": 
"rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15494 1726853340.67502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
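The module result streamed above is the standard `package_facts` payload: a dict under `ansible_facts.packages` keyed by package name, where each value is a list of install records (`name`, `version`, `release`, `epoch`, `arch`, `source`). As a sketch of how that structure can be consumed outside Ansible, the snippet below parses a miniature payload containing only the `git` entry copied from the log; the payload literal is a hypothetical reduction, not the full output.

```python
import json

# Miniature stand-in for the package_facts result seen in the log;
# the real payload lists every installed package under "packages".
payload = json.loads("""
{"ansible_facts": {"packages": {
  "git": [{"name": "git", "version": "2.45.2", "release": "3.el10",
           "epoch": null, "arch": "x86_64", "source": "rpm"}]
}}}
""")

packages = payload["ansible_facts"]["packages"]
git = packages["git"][0]  # first (usually only) install record

# Build an NEVRA-style string; "epoch" is null (None) for most packages.
epoch = f"{git['epoch']}:" if git["epoch"] is not None else ""
nvra = f"{git['name']}-{epoch}{git['version']}-{git['release']}.{git['arch']}"
print(nvra)  # -> git-2.45.2-3.el10.x86_64
```

In a playbook the same lookup is typically written as `ansible_facts.packages['git'][0].version`, guarded by a check that the key exists, since `packages` only contains what the detected package manager reported.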
<<< 15494 1726853340.67572: stderr chunk (state=3): >>><<< 15494 1726853340.67576: stdout chunk (state=3): >>><<< 15494 1726853340.67739: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853340.73342: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853340.73379: _low_level_execute_command(): starting 15494 1726853340.73395: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853339.8335845-15977-82045185017220/ > /dev/null 2>&1 && sleep 0' 15494 1726853340.74033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853340.74052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853340.74075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853340.74093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853340.74191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853340.74210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853340.74231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853340.74258: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853340.74415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853340.76301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853340.76316: stdout chunk (state=3): >>><<< 15494 1726853340.76379: stderr chunk (state=3): >>><<< 15494 1726853340.76399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853340.76410: handler run complete 15494 1726853340.78009: variable 'ansible_facts' from source: unknown 15494 1726853340.78441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853340.79866: variable 'ansible_facts' from source: unknown 15494 1726853340.80755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853340.81238: attempt loop complete, returning result 15494 1726853340.81251: _execute() done 15494 1726853340.81254: dumping result to json 15494 1726853340.81374: done dumping result, returning 15494 1726853340.81385: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-0028-1a50-000000000193] 15494 1726853340.81388: sending task result for task 02083763-bbaf-0028-1a50-000000000193 15494 1726853340.82750: done sending task result for task 02083763-bbaf-0028-1a50-000000000193 15494 1726853340.82754: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853340.82795: no more pending results, returning what we have 15494 1726853340.82797: results queue empty 15494 1726853340.82798: checking for any_errors_fatal 15494 1726853340.82802: done checking for any_errors_fatal 15494 1726853340.82802: checking for max_fail_percentage 15494 1726853340.82803: done checking for max_fail_percentage 15494 1726853340.82804: checking to see if all hosts have failed and the running result is not ok 15494 1726853340.82804: done checking to see if all hosts have failed 15494 1726853340.82805: getting the remaining hosts for this loop 15494 1726853340.82806: done getting the remaining hosts for this loop 15494 1726853340.82810: getting 
the next task for host managed_node1 15494 1726853340.82816: done getting next task for host managed_node1 15494 1726853340.82818: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15494 1726853340.82819: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853340.82825: getting variables 15494 1726853340.82826: in VariableManager get_vars() 15494 1726853340.82848: Calling all_inventory to load vars for managed_node1 15494 1726853340.82850: Calling groups_inventory to load vars for managed_node1 15494 1726853340.82851: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853340.82857: Calling all_plugins_play to load vars for managed_node1 15494 1726853340.82859: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853340.82861: Calling groups_plugins_play to load vars for managed_node1 15494 1726853340.83584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853340.84454: done with get_vars() 15494 1726853340.84474: done getting variables 15494 1726853340.84517: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:29:00 -0400 (0:00:01.063) 0:00:09.461 ****** 15494 1726853340.84539: entering _queue_task() for managed_node1/debug 
15494 1726853340.84786: worker is 1 (out of 1 available) 15494 1726853340.84800: exiting _queue_task() for managed_node1/debug 15494 1726853340.84812: done queuing things up, now waiting for results queue to drain 15494 1726853340.84814: waiting for pending results... 15494 1726853340.84983: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15494 1726853340.85050: in run() - task 02083763-bbaf-0028-1a50-000000000015 15494 1726853340.85064: variable 'ansible_search_path' from source: unknown 15494 1726853340.85067: variable 'ansible_search_path' from source: unknown 15494 1726853340.85102: calling self._execute() 15494 1726853340.85174: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853340.85178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853340.85187: variable 'omit' from source: magic vars 15494 1726853340.85453: variable 'ansible_distribution_major_version' from source: facts 15494 1726853340.85462: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853340.85468: variable 'omit' from source: magic vars 15494 1726853340.85498: variable 'omit' from source: magic vars 15494 1726853340.85566: variable 'network_provider' from source: set_fact 15494 1726853340.85590: variable 'omit' from source: magic vars 15494 1726853340.85620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853340.85646: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853340.85665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853340.85679: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853340.85690: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853340.85714: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853340.85717: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853340.85720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853340.85788: Set connection var ansible_connection to ssh 15494 1726853340.85792: Set connection var ansible_pipelining to False 15494 1726853340.85798: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853340.85802: Set connection var ansible_shell_type to sh 15494 1726853340.85805: Set connection var ansible_timeout to 10 15494 1726853340.85814: Set connection var ansible_shell_executable to /bin/sh 15494 1726853340.85833: variable 'ansible_shell_executable' from source: unknown 15494 1726853340.85836: variable 'ansible_connection' from source: unknown 15494 1726853340.85839: variable 'ansible_module_compression' from source: unknown 15494 1726853340.85841: variable 'ansible_shell_type' from source: unknown 15494 1726853340.85844: variable 'ansible_shell_executable' from source: unknown 15494 1726853340.85846: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853340.85851: variable 'ansible_pipelining' from source: unknown 15494 1726853340.85854: variable 'ansible_timeout' from source: unknown 15494 1726853340.85858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853340.85961: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853340.85969: variable 'omit' from source: magic vars 15494 1726853340.85975: starting attempt 
loop 15494 1726853340.85978: running the handler 15494 1726853340.86012: handler run complete 15494 1726853340.86024: attempt loop complete, returning result 15494 1726853340.86028: _execute() done 15494 1726853340.86030: dumping result to json 15494 1726853340.86032: done dumping result, returning 15494 1726853340.86042: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-0028-1a50-000000000015] 15494 1726853340.86045: sending task result for task 02083763-bbaf-0028-1a50-000000000015 15494 1726853340.86123: done sending task result for task 02083763-bbaf-0028-1a50-000000000015 15494 1726853340.86125: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 15494 1726853340.86196: no more pending results, returning what we have 15494 1726853340.86200: results queue empty 15494 1726853340.86200: checking for any_errors_fatal 15494 1726853340.86209: done checking for any_errors_fatal 15494 1726853340.86209: checking for max_fail_percentage 15494 1726853340.86211: done checking for max_fail_percentage 15494 1726853340.86212: checking to see if all hosts have failed and the running result is not ok 15494 1726853340.86213: done checking to see if all hosts have failed 15494 1726853340.86213: getting the remaining hosts for this loop 15494 1726853340.86215: done getting the remaining hosts for this loop 15494 1726853340.86218: getting the next task for host managed_node1 15494 1726853340.86225: done getting next task for host managed_node1 15494 1726853340.86228: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15494 1726853340.86229: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15494 1726853340.86238: getting variables 15494 1726853340.86240: in VariableManager get_vars() 15494 1726853340.86276: Calling all_inventory to load vars for managed_node1 15494 1726853340.86278: Calling groups_inventory to load vars for managed_node1 15494 1726853340.86280: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853340.86288: Calling all_plugins_play to load vars for managed_node1 15494 1726853340.86291: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853340.86293: Calling groups_plugins_play to load vars for managed_node1 15494 1726853340.87048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853340.87921: done with get_vars() 15494 1726853340.87939: done getting variables 15494 1726853340.88010: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:29:00 -0400 (0:00:00.034) 0:00:09.496 ****** 15494 1726853340.88033: entering _queue_task() for managed_node1/fail 15494 1726853340.88034: Creating lock for fail 15494 1726853340.88277: worker is 1 (out of 1 available) 15494 1726853340.88291: exiting _queue_task() for managed_node1/fail 15494 1726853340.88302: done queuing things up, now waiting for results queue to drain 15494 1726853340.88303: waiting for pending results... 
15494 1726853340.88466: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15494 1726853340.88533: in run() - task 02083763-bbaf-0028-1a50-000000000016 15494 1726853340.88544: variable 'ansible_search_path' from source: unknown 15494 1726853340.88549: variable 'ansible_search_path' from source: unknown 15494 1726853340.88580: calling self._execute() 15494 1726853340.88651: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853340.88655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853340.88665: variable 'omit' from source: magic vars 15494 1726853340.88928: variable 'ansible_distribution_major_version' from source: facts 15494 1726853340.88936: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853340.89022: variable 'network_state' from source: role '' defaults 15494 1726853340.89032: Evaluated conditional (network_state != {}): False 15494 1726853340.89035: when evaluation is False, skipping this task 15494 1726853340.89037: _execute() done 15494 1726853340.89040: dumping result to json 15494 1726853340.89042: done dumping result, returning 15494 1726853340.89051: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-0028-1a50-000000000016] 15494 1726853340.89057: sending task result for task 02083763-bbaf-0028-1a50-000000000016 15494 1726853340.89136: done sending task result for task 02083763-bbaf-0028-1a50-000000000016 15494 1726853340.89139: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853340.89215: no more pending results, 
returning what we have 15494 1726853340.89219: results queue empty 15494 1726853340.89220: checking for any_errors_fatal 15494 1726853340.89226: done checking for any_errors_fatal 15494 1726853340.89227: checking for max_fail_percentage 15494 1726853340.89228: done checking for max_fail_percentage 15494 1726853340.89229: checking to see if all hosts have failed and the running result is not ok 15494 1726853340.89230: done checking to see if all hosts have failed 15494 1726853340.89230: getting the remaining hosts for this loop 15494 1726853340.89232: done getting the remaining hosts for this loop 15494 1726853340.89235: getting the next task for host managed_node1 15494 1726853340.89240: done getting next task for host managed_node1 15494 1726853340.89244: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15494 1726853340.89246: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853340.89262: getting variables 15494 1726853340.89263: in VariableManager get_vars() 15494 1726853340.89293: Calling all_inventory to load vars for managed_node1 15494 1726853340.89295: Calling groups_inventory to load vars for managed_node1 15494 1726853340.89297: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853340.89305: Calling all_plugins_play to load vars for managed_node1 15494 1726853340.89307: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853340.89310: Calling groups_plugins_play to load vars for managed_node1 15494 1726853340.90154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853340.91018: done with get_vars() 15494 1726853340.91036: done getting variables 15494 1726853340.91082: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:29:00 -0400 (0:00:00.030) 0:00:09.526 ****** 15494 1726853340.91104: entering _queue_task() for managed_node1/fail 15494 1726853340.91334: worker is 1 (out of 1 available) 15494 1726853340.91348: exiting _queue_task() for managed_node1/fail 15494 1726853340.91360: done queuing things up, now waiting for results queue to drain 15494 1726853340.91361: waiting for pending results... 
15494 1726853340.91523: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15494 1726853340.91592: in run() - task 02083763-bbaf-0028-1a50-000000000017 15494 1726853340.91600: variable 'ansible_search_path' from source: unknown 15494 1726853340.91603: variable 'ansible_search_path' from source: unknown 15494 1726853340.91629: calling self._execute() 15494 1726853340.91702: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853340.91706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853340.91715: variable 'omit' from source: magic vars 15494 1726853340.91976: variable 'ansible_distribution_major_version' from source: facts 15494 1726853340.91985: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853340.92069: variable 'network_state' from source: role '' defaults 15494 1726853340.92078: Evaluated conditional (network_state != {}): False 15494 1726853340.92081: when evaluation is False, skipping this task 15494 1726853340.92084: _execute() done 15494 1726853340.92086: dumping result to json 15494 1726853340.92089: done dumping result, returning 15494 1726853340.92097: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-0028-1a50-000000000017] 15494 1726853340.92102: sending task result for task 02083763-bbaf-0028-1a50-000000000017 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853340.92231: no more pending results, returning what we have 15494 1726853340.92234: results queue empty 15494 1726853340.92235: checking for any_errors_fatal 15494 1726853340.92242: done checking for any_errors_fatal 
15494 1726853340.92243: checking for max_fail_percentage 15494 1726853340.92244: done checking for max_fail_percentage 15494 1726853340.92245: checking to see if all hosts have failed and the running result is not ok 15494 1726853340.92246: done checking to see if all hosts have failed 15494 1726853340.92247: getting the remaining hosts for this loop 15494 1726853340.92248: done getting the remaining hosts for this loop 15494 1726853340.92252: getting the next task for host managed_node1 15494 1726853340.92258: done getting next task for host managed_node1 15494 1726853340.92261: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15494 1726853340.92263: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853340.92281: getting variables 15494 1726853340.92283: in VariableManager get_vars() 15494 1726853340.92313: Calling all_inventory to load vars for managed_node1 15494 1726853340.92316: Calling groups_inventory to load vars for managed_node1 15494 1726853340.92318: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853340.92325: Calling all_plugins_play to load vars for managed_node1 15494 1726853340.92328: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853340.92330: Calling groups_plugins_play to load vars for managed_node1 15494 1726853340.92884: done sending task result for task 02083763-bbaf-0028-1a50-000000000017 15494 1726853340.92888: WORKER PROCESS EXITING 15494 1726853340.93087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853340.93960: done with get_vars() 15494 1726853340.93980: done getting variables 15494 1726853340.94024: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:29:00 -0400 (0:00:00.029) 0:00:09.556 ****** 15494 1726853340.94045: entering _queue_task() for managed_node1/fail 15494 1726853340.94279: worker is 1 (out of 1 available) 15494 1726853340.94293: exiting _queue_task() for managed_node1/fail 15494 1726853340.94306: done queuing things up, now waiting for results queue to drain 15494 1726853340.94307: waiting for pending results... 
15494 1726853340.94478: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15494 1726853340.94544: in run() - task 02083763-bbaf-0028-1a50-000000000018 15494 1726853340.94558: variable 'ansible_search_path' from source: unknown 15494 1726853340.94562: variable 'ansible_search_path' from source: unknown 15494 1726853340.94591: calling self._execute() 15494 1726853340.94662: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853340.94666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853340.94673: variable 'omit' from source: magic vars 15494 1726853340.94937: variable 'ansible_distribution_major_version' from source: facts 15494 1726853340.94946: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853340.95067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853340.96782: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853340.96841: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853340.96869: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853340.96895: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853340.96915: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853340.96975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853340.96996: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853340.97014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853340.97043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853340.97055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853340.97121: variable 'ansible_distribution_major_version' from source: facts 15494 1726853340.97137: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15494 1726853340.97215: variable 'ansible_distribution' from source: facts 15494 1726853340.97219: variable '__network_rh_distros' from source: role '' defaults 15494 1726853340.97226: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15494 1726853340.97386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853340.97402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853340.97418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 
1726853340.97444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853340.97455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853340.97492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853340.97508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853340.97523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853340.97549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853340.97558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853340.97591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853340.97608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15494 1726853340.97623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853340.97649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853340.97658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853340.97845: variable 'network_connections' from source: play vars 15494 1726853340.97855: variable 'interface' from source: set_fact 15494 1726853340.97907: variable 'interface' from source: set_fact 15494 1726853340.97917: variable 'interface' from source: set_fact 15494 1726853340.97957: variable 'interface' from source: set_fact 15494 1726853340.97965: variable 'network_state' from source: role '' defaults 15494 1726853340.98012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853340.98122: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853340.98151: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853340.98189: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853340.98209: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853340.98329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853340.98339: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853340.98344: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853340.98349: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853340.98352: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15494 1726853340.98355: when evaluation is False, skipping this task 15494 1726853340.98357: _execute() done 15494 1726853340.98359: dumping result to json 15494 1726853340.98361: done dumping result, returning 15494 1726853340.98364: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-0028-1a50-000000000018] 15494 1726853340.98366: sending task result for task 02083763-bbaf-0028-1a50-000000000018 15494 1726853340.98434: done sending task result for task 02083763-bbaf-0028-1a50-000000000018 15494 1726853340.98437: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15494 
1726853340.98489: no more pending results, returning what we have 15494 1726853340.98493: results queue empty 15494 1726853340.98494: checking for any_errors_fatal 15494 1726853340.98499: done checking for any_errors_fatal 15494 1726853340.98499: checking for max_fail_percentage 15494 1726853340.98501: done checking for max_fail_percentage 15494 1726853340.98502: checking to see if all hosts have failed and the running result is not ok 15494 1726853340.98502: done checking to see if all hosts have failed 15494 1726853340.98503: getting the remaining hosts for this loop 15494 1726853340.98505: done getting the remaining hosts for this loop 15494 1726853340.98508: getting the next task for host managed_node1 15494 1726853340.98514: done getting next task for host managed_node1 15494 1726853340.98518: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15494 1726853340.98519: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853340.98532: getting variables 15494 1726853340.98534: in VariableManager get_vars() 15494 1726853340.98572: Calling all_inventory to load vars for managed_node1 15494 1726853340.98575: Calling groups_inventory to load vars for managed_node1 15494 1726853340.98577: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853340.98586: Calling all_plugins_play to load vars for managed_node1 15494 1726853340.98595: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853340.98598: Calling groups_plugins_play to load vars for managed_node1 15494 1726853340.99468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853341.00338: done with get_vars() 15494 1726853341.00358: done getting variables 15494 1726853341.00431: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:29:01 -0400 (0:00:00.064) 0:00:09.620 ****** 15494 1726853341.00457: entering _queue_task() for managed_node1/dnf 15494 1726853341.00698: worker is 1 (out of 1 available) 15494 1726853341.00712: exiting _queue_task() for managed_node1/dnf 15494 1726853341.00725: done queuing things up, now waiting for results queue to drain 15494 1726853341.00727: waiting for pending results... 
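The teaming abort task above was skipped because no configured connection has `type: team`. The role's Jinja2 guard, `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0`, can be approximated with stdlib Python; the connection list below is assumed for illustration (the real `interface` values come from `set_fact` and are not shown in this log):

```python
import re

# Hedged stdlib approximation of the role's Jinja2 guard:
#   network_connections | selectattr("type", "defined")
#                       | selectattr("type", "match", "^team$")
#                       | list | length > 0
def has_team_connection(connections):
    defined = [c for c in connections if "type" in c]   # selectattr(..., "defined")
    teams = [c for c in defined
             if re.match(r"^team$", c["type"])]         # selectattr(..., "match", "^team$")
    return len(teams) > 0

# Assumed example connection; the actual play vars are not visible here.
network_connections = [{"name": "ethtest0", "type": "ethernet", "state": "up"}]
print(has_team_connection(network_connections))  # False -> abort task is skipped
```

Note that Jinja2's `match` test anchors at the start of the string, so `re.match` is the right analogue; the same guard is applied a second time to `network_state.get("interfaces", [])`, which is `{}` by default here.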
15494 1726853341.00890: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15494 1726853341.00958: in run() - task 02083763-bbaf-0028-1a50-000000000019 15494 1726853341.00963: variable 'ansible_search_path' from source: unknown 15494 1726853341.00966: variable 'ansible_search_path' from source: unknown 15494 1726853341.00998: calling self._execute() 15494 1726853341.01067: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853341.01070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853341.01083: variable 'omit' from source: magic vars 15494 1726853341.01341: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.01387: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853341.01485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853341.02960: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853341.03005: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853341.03034: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853341.03060: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853341.03081: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853341.03140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.03161: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.03179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.03204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.03214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.03299: variable 'ansible_distribution' from source: facts 15494 1726853341.03302: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.03314: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15494 1726853341.03394: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853341.03478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.03494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.03510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.03534: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.03545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.03577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.03593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.03608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.03631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.03642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.03673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.03689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 
1726853341.03705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.03728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.03738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.03840: variable 'network_connections' from source: play vars 15494 1726853341.03851: variable 'interface' from source: set_fact 15494 1726853341.03901: variable 'interface' from source: set_fact 15494 1726853341.03908: variable 'interface' from source: set_fact 15494 1726853341.03951: variable 'interface' from source: set_fact 15494 1726853341.04001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853341.04122: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853341.04151: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853341.04172: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853341.04193: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853341.04227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853341.04242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853341.04264: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.04283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853341.04327: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853341.04474: variable 'network_connections' from source: play vars 15494 1726853341.04477: variable 'interface' from source: set_fact 15494 1726853341.04519: variable 'interface' from source: set_fact 15494 1726853341.04531: variable 'interface' from source: set_fact 15494 1726853341.04569: variable 'interface' from source: set_fact 15494 1726853341.04594: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853341.04597: when evaluation is False, skipping this task 15494 1726853341.04599: _execute() done 15494 1726853341.04602: dumping result to json 15494 1726853341.04604: done dumping result, returning 15494 1726853341.04612: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000019] 15494 1726853341.04616: sending task result for task 02083763-bbaf-0028-1a50-000000000019 15494 1726853341.04710: done sending task result for task 02083763-bbaf-0028-1a50-000000000019 15494 1726853341.04712: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15494 1726853341.04795: no more pending results, returning what we have 15494 1726853341.04799: results queue empty 15494 1726853341.04799: checking for any_errors_fatal 15494 1726853341.04804: done checking for any_errors_fatal 15494 1726853341.04804: checking for max_fail_percentage 15494 1726853341.04806: done checking for max_fail_percentage 15494 1726853341.04807: checking to see if all hosts have failed and the running result is not ok 15494 1726853341.04807: done checking to see if all hosts have failed 15494 1726853341.04808: getting the remaining hosts for this loop 15494 1726853341.04810: done getting the remaining hosts for this loop 15494 1726853341.04813: getting the next task for host managed_node1 15494 1726853341.04818: done getting next task for host managed_node1 15494 1726853341.04822: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15494 1726853341.04824: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853341.04836: getting variables 15494 1726853341.04837: in VariableManager get_vars() 15494 1726853341.04872: Calling all_inventory to load vars for managed_node1 15494 1726853341.04875: Calling groups_inventory to load vars for managed_node1 15494 1726853341.04877: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853341.04885: Calling all_plugins_play to load vars for managed_node1 15494 1726853341.04887: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853341.04896: Calling groups_plugins_play to load vars for managed_node1 15494 1726853341.05675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853341.06659: done with get_vars() 15494 1726853341.06678: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15494 1726853341.06732: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:29:01 -0400 (0:00:00.062) 0:00:09.683 ****** 15494 1726853341.06757: entering _queue_task() for managed_node1/yum 15494 1726853341.06758: Creating lock for yum 15494 1726853341.07006: worker is 1 (out of 1 available) 15494 1726853341.07019: exiting _queue_task() for managed_node1/yum 15494 1726853341.07032: done queuing things up, now waiting for results queue to drain 15494 1726853341.07033: waiting for pending results... 
15494 1726853341.07199: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15494 1726853341.07260: in run() - task 02083763-bbaf-0028-1a50-00000000001a 15494 1726853341.07272: variable 'ansible_search_path' from source: unknown 15494 1726853341.07277: variable 'ansible_search_path' from source: unknown 15494 1726853341.07306: calling self._execute() 15494 1726853341.07373: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853341.07376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853341.07387: variable 'omit' from source: magic vars 15494 1726853341.07643: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.07653: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853341.07767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853341.09241: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853341.09286: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853341.09313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853341.09341: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853341.09360: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853341.09417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.09442: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.09460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.09488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.09500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.09567: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.09582: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15494 1726853341.09585: when evaluation is False, skipping this task 15494 1726853341.09588: _execute() done 15494 1726853341.09590: dumping result to json 15494 1726853341.09592: done dumping result, returning 15494 1726853341.09601: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-0028-1a50-00000000001a] 15494 1726853341.09603: sending task result for task 02083763-bbaf-0028-1a50-00000000001a 15494 1726853341.09693: done sending task result for task 02083763-bbaf-0028-1a50-00000000001a 15494 1726853341.09695: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15494 1726853341.09742: no more pending results, returning 
what we have 15494 1726853341.09745: results queue empty 15494 1726853341.09748: checking for any_errors_fatal 15494 1726853341.09754: done checking for any_errors_fatal 15494 1726853341.09755: checking for max_fail_percentage 15494 1726853341.09757: done checking for max_fail_percentage 15494 1726853341.09758: checking to see if all hosts have failed and the running result is not ok 15494 1726853341.09758: done checking to see if all hosts have failed 15494 1726853341.09759: getting the remaining hosts for this loop 15494 1726853341.09760: done getting the remaining hosts for this loop 15494 1726853341.09764: getting the next task for host managed_node1 15494 1726853341.09772: done getting next task for host managed_node1 15494 1726853341.09776: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15494 1726853341.09778: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853341.09790: getting variables 15494 1726853341.09791: in VariableManager get_vars() 15494 1726853341.09828: Calling all_inventory to load vars for managed_node1 15494 1726853341.09831: Calling groups_inventory to load vars for managed_node1 15494 1726853341.09833: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853341.09842: Calling all_plugins_play to load vars for managed_node1 15494 1726853341.09844: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853341.09849: Calling groups_plugins_play to load vars for managed_node1 15494 1726853341.10627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853341.11494: done with get_vars() 15494 1726853341.11513: done getting variables 15494 1726853341.11558: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:29:01 -0400 (0:00:00.048) 0:00:09.731 ****** 15494 1726853341.11582: entering _queue_task() for managed_node1/fail 15494 1726853341.11821: worker is 1 (out of 1 available) 15494 1726853341.11835: exiting _queue_task() for managed_node1/fail 15494 1726853341.11850: done queuing things up, now waiting for results queue to drain 15494 1726853341.11851: waiting for pending results... 
15494 1726853341.12017: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15494 1726853341.12090: in run() - task 02083763-bbaf-0028-1a50-00000000001b 15494 1726853341.12101: variable 'ansible_search_path' from source: unknown 15494 1726853341.12104: variable 'ansible_search_path' from source: unknown 15494 1726853341.12133: calling self._execute() 15494 1726853341.12204: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853341.12208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853341.12216: variable 'omit' from source: magic vars 15494 1726853341.12482: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.12491: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853341.12575: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853341.12702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853341.14163: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853341.14206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853341.14234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853341.14262: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853341.14284: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853341.14341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15494 1726853341.14364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.14389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.14414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.14424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.14458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.14481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.14498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.14522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.14532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.14561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.14581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.14598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.14622: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.14633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.14748: variable 'network_connections' from source: play vars 15494 1726853341.14757: variable 'interface' from source: set_fact 15494 1726853341.14814: variable 'interface' from source: set_fact 15494 1726853341.14821: variable 'interface' from source: set_fact 15494 1726853341.14863: variable 'interface' from source: set_fact 15494 1726853341.14917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853341.15265: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853341.15293: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853341.15314: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853341.15336: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853341.15369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853341.15386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853341.15403: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.15419: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853341.15467: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853341.15615: variable 'network_connections' from source: play vars 15494 1726853341.15618: variable 'interface' from source: set_fact 15494 1726853341.15660: variable 'interface' from source: set_fact 15494 1726853341.15668: variable 'interface' from source: set_fact 15494 1726853341.15710: variable 'interface' from source: set_fact 15494 1726853341.15734: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853341.15737: when evaluation is False, skipping this task 15494 1726853341.15739: _execute() done 15494 1726853341.15742: dumping result to json 15494 1726853341.15745: done dumping result, returning 15494 1726853341.15753: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-0028-1a50-00000000001b] 15494 1726853341.15763: sending task result for task 02083763-bbaf-0028-1a50-00000000001b 15494 1726853341.15849: done sending task result for task 02083763-bbaf-0028-1a50-00000000001b 15494 1726853341.15852: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15494 1726853341.15935: no more pending results, returning what we have 15494 1726853341.15938: results queue empty 15494 1726853341.15939: checking for any_errors_fatal 15494 1726853341.15943: done checking for any_errors_fatal 15494 1726853341.15944: checking for max_fail_percentage 15494 1726853341.15945: done checking for max_fail_percentage 15494 1726853341.15948: checking to see if all hosts have failed and the running result is not ok 15494 1726853341.15949: done checking to see if all hosts have failed 15494 1726853341.15950: getting the remaining hosts for this loop 15494 1726853341.15952: done getting the remaining hosts for this loop 15494 1726853341.15955: getting the next task for host managed_node1 15494 1726853341.15962: done getting next task for host managed_node1 15494 1726853341.15966: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15494 1726853341.15968: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853341.15982: getting variables 15494 1726853341.15983: in VariableManager get_vars() 15494 1726853341.16018: Calling all_inventory to load vars for managed_node1 15494 1726853341.16021: Calling groups_inventory to load vars for managed_node1 15494 1726853341.16022: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853341.16031: Calling all_plugins_play to load vars for managed_node1 15494 1726853341.16034: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853341.16036: Calling groups_plugins_play to load vars for managed_node1 15494 1726853341.16937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853341.17793: done with get_vars() 15494 1726853341.17810: done getting variables 15494 1726853341.17855: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:29:01 -0400 (0:00:00.062) 0:00:09.794 ****** 15494 1726853341.17878: entering _queue_task() for managed_node1/package 15494 1726853341.18113: worker is 1 (out of 1 available) 15494 1726853341.18129: exiting _queue_task() for managed_node1/package 15494 1726853341.18141: done queuing things up, now waiting for results queue to drain 15494 1726853341.18142: waiting for pending results... 
15494 1726853341.18307: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15494 1726853341.18381: in run() - task 02083763-bbaf-0028-1a50-00000000001c 15494 1726853341.18392: variable 'ansible_search_path' from source: unknown 15494 1726853341.18396: variable 'ansible_search_path' from source: unknown 15494 1726853341.18423: calling self._execute() 15494 1726853341.18495: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853341.18499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853341.18507: variable 'omit' from source: magic vars 15494 1726853341.18780: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.18789: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853341.18921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853341.19108: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853341.19141: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853341.19167: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853341.19193: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853341.19270: variable 'network_packages' from source: role '' defaults 15494 1726853341.19338: variable '__network_provider_setup' from source: role '' defaults 15494 1726853341.19353: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853341.19402: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853341.19409: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853341.19459: variable 
'__network_packages_default_nm' from source: role '' defaults 15494 1726853341.19570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853341.20911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853341.20960: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853341.20990: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853341.21014: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853341.21034: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853341.21097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.21116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.21133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.21161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.21173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 
1726853341.21209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.21225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.21241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.21268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.21280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.21425: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15494 1726853341.21499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.21516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.21535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.21561: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.21573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.21631: variable 'ansible_python' from source: facts 15494 1726853341.21659: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15494 1726853341.21714: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853341.21773: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853341.21851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.21877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.21894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.21917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.21928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.21966: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853341.21981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853341.21999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.22023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853341.22033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853341.22130: variable 'network_connections' from source: play vars 15494 1726853341.22135: variable 'interface' from source: set_fact 15494 1726853341.22207: variable 'interface' from source: set_fact 15494 1726853341.22216: variable 'interface' from source: set_fact 15494 1726853341.22289: variable 'interface' from source: set_fact 15494 1726853341.22336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853341.22356: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853341.22378: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853341.22401: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853341.22435: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853341.22610: variable 'network_connections' from source: play vars 15494 1726853341.22613: variable 'interface' from source: set_fact 15494 1726853341.22685: variable 'interface' from source: set_fact 15494 1726853341.22692: variable 'interface' from source: set_fact 15494 1726853341.22760: variable 'interface' from source: set_fact 15494 1726853341.22798: variable '__network_packages_default_wireless' from source: role '' defaults 15494 1726853341.22853: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853341.23043: variable 'network_connections' from source: play vars 15494 1726853341.23051: variable 'interface' from source: set_fact 15494 1726853341.23097: variable 'interface' from source: set_fact 15494 1726853341.23102: variable 'interface' from source: set_fact 15494 1726853341.23145: variable 'interface' from source: set_fact 15494 1726853341.23169: variable '__network_packages_default_team' from source: role '' defaults 15494 1726853341.23220: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853341.23409: variable 'network_connections' from source: play vars 15494 1726853341.23412: variable 'interface' from source: set_fact 15494 1726853341.23458: variable 'interface' from source: set_fact 15494 1726853341.23463: variable 'interface' from source: set_fact 15494 1726853341.23511: variable 'interface' from source: set_fact 15494 1726853341.23553: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 
1726853341.23595: variable '__network_service_name_default_initscripts' from source: role '' defaults
15494 1726853341.23600: variable '__network_packages_default_initscripts' from source: role '' defaults
15494 1726853341.23644: variable '__network_packages_default_initscripts' from source: role '' defaults
15494 1726853341.23790: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15494 1726853341.24082: variable 'network_connections' from source: play vars
15494 1726853341.24086: variable 'interface' from source: set_fact
15494 1726853341.24128: variable 'interface' from source: set_fact
15494 1726853341.24134: variable 'interface' from source: set_fact
15494 1726853341.24180: variable 'interface' from source: set_fact
15494 1726853341.24188: variable 'ansible_distribution' from source: facts
15494 1726853341.24190: variable '__network_rh_distros' from source: role '' defaults
15494 1726853341.24196: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.24218: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15494 1726853341.24325: variable 'ansible_distribution' from source: facts
15494 1726853341.24328: variable '__network_rh_distros' from source: role '' defaults
15494 1726853341.24333: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.24341: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15494 1726853341.24444: variable 'ansible_distribution' from source: facts
15494 1726853341.24450: variable '__network_rh_distros' from source: role '' defaults
15494 1726853341.24453: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.24482: variable 'network_provider' from source: set_fact
15494 1726853341.24492: variable 'ansible_facts' from source: unknown
15494 1726853341.24919: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
15494 1726853341.24922: when evaluation is False, skipping this task
15494 1726853341.24925: _execute() done
15494 1726853341.24927: dumping result to json
15494 1726853341.24929: done dumping result, returning
15494 1726853341.24932: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-0028-1a50-00000000001c]
15494 1726853341.24934: sending task result for task 02083763-bbaf-0028-1a50-00000000001c
15494 1726853341.25022: done sending task result for task 02083763-bbaf-0028-1a50-00000000001c
15494 1726853341.25026: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
15494 1726853341.25091: no more pending results, returning what we have
15494 1726853341.25094: results queue empty
15494 1726853341.25095: checking for any_errors_fatal
15494 1726853341.25100: done checking for any_errors_fatal
15494 1726853341.25101: checking for max_fail_percentage
15494 1726853341.25103: done checking for max_fail_percentage
15494 1726853341.25103: checking to see if all hosts have failed and the running result is not ok
15494 1726853341.25104: done checking to see if all hosts have failed
15494 1726853341.25105: getting the remaining hosts for this loop
15494 1726853341.25106: done getting the remaining hosts for this loop
15494 1726853341.25110: getting the next task for host managed_node1
15494 1726853341.25116: done getting next task for host managed_node1
15494 1726853341.25119: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15494 1726853341.25121: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853341.25133: getting variables
15494 1726853341.25137: in VariableManager get_vars()
15494 1726853341.25174: Calling all_inventory to load vars for managed_node1
15494 1726853341.25177: Calling groups_inventory to load vars for managed_node1
15494 1726853341.25179: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853341.25192: Calling all_plugins_play to load vars for managed_node1
15494 1726853341.25194: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853341.25197: Calling groups_plugins_play to load vars for managed_node1
15494 1726853341.26010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853341.26888: done with get_vars()
15494 1726853341.26909: done getting variables
15494 1726853341.26954: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 13:29:01 -0400 (0:00:00.090) 0:00:09.885 ******
15494 1726853341.26978: entering _queue_task() for managed_node1/package
15494 1726853341.27219: worker is 1 (out of 1 available)
15494 1726853341.27234: exiting _queue_task() for managed_node1/package
15494 1726853341.27250: done queuing things up, now waiting for results queue to drain
15494 1726853341.27252: waiting for pending results...
15494 1726853341.27424: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
15494 1726853341.27498: in run() - task 02083763-bbaf-0028-1a50-00000000001d
15494 1726853341.27509: variable 'ansible_search_path' from source: unknown
15494 1726853341.27512: variable 'ansible_search_path' from source: unknown
15494 1726853341.27540: calling self._execute()
15494 1726853341.27613: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853341.27617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853341.27627: variable 'omit' from source: magic vars
15494 1726853341.27897: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.27908: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853341.27990: variable 'network_state' from source: role '' defaults
15494 1726853341.27998: Evaluated conditional (network_state != {}): False
15494 1726853341.28001: when evaluation is False, skipping this task
15494 1726853341.28004: _execute() done
15494 1726853341.28006: dumping result to json
15494 1726853341.28008: done dumping result, returning
15494 1726853341.28018: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-0028-1a50-00000000001d]
15494 1726853341.28021: sending task result for task 02083763-bbaf-0028-1a50-00000000001d
15494 1726853341.28105: done sending task result for task 02083763-bbaf-0028-1a50-00000000001d
15494 1726853341.28108: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15494 1726853341.28174: no more pending results, returning what we have
15494 1726853341.28179: results queue empty
15494 1726853341.28179: checking for any_errors_fatal
15494 1726853341.28184: done checking for any_errors_fatal
15494 1726853341.28185: checking for max_fail_percentage
15494 1726853341.28187: done checking for max_fail_percentage
15494 1726853341.28188: checking to see if all hosts have failed and the running result is not ok
15494 1726853341.28188: done checking to see if all hosts have failed
15494 1726853341.28189: getting the remaining hosts for this loop
15494 1726853341.28190: done getting the remaining hosts for this loop
15494 1726853341.28194: getting the next task for host managed_node1
15494 1726853341.28199: done getting next task for host managed_node1
15494 1726853341.28203: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15494 1726853341.28204: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853341.28218: getting variables
15494 1726853341.28220: in VariableManager get_vars()
15494 1726853341.28249: Calling all_inventory to load vars for managed_node1
15494 1726853341.28251: Calling groups_inventory to load vars for managed_node1
15494 1726853341.28253: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853341.28261: Calling all_plugins_play to load vars for managed_node1
15494 1726853341.28263: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853341.28266: Calling groups_plugins_play to load vars for managed_node1
15494 1726853341.31575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853341.32429: done with get_vars()
15494 1726853341.32447: done getting variables
15494 1726853341.32483: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 13:29:01 -0400 (0:00:00.055) 0:00:09.940 ******
15494 1726853341.32502: entering _queue_task() for managed_node1/package
15494 1726853341.32753: worker is 1 (out of 1 available)
15494 1726853341.32766: exiting _queue_task() for managed_node1/package
15494 1726853341.32780: done queuing things up, now waiting for results queue to drain
15494 1726853341.32781: waiting for pending results...
15494 1726853341.32942: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
15494 1726853341.33014: in run() - task 02083763-bbaf-0028-1a50-00000000001e
15494 1726853341.33025: variable 'ansible_search_path' from source: unknown
15494 1726853341.33028: variable 'ansible_search_path' from source: unknown
15494 1726853341.33058: calling self._execute()
15494 1726853341.33130: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853341.33135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853341.33144: variable 'omit' from source: magic vars
15494 1726853341.33415: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.33424: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853341.33509: variable 'network_state' from source: role '' defaults
15494 1726853341.33518: Evaluated conditional (network_state != {}): False
15494 1726853341.33521: when evaluation is False, skipping this task
15494 1726853341.33523: _execute() done
15494 1726853341.33526: dumping result to json
15494 1726853341.33528: done dumping result, returning
15494 1726853341.33535: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-0028-1a50-00000000001e]
15494 1726853341.33540: sending task result for task 02083763-bbaf-0028-1a50-00000000001e
15494 1726853341.33629: done sending task result for task 02083763-bbaf-0028-1a50-00000000001e
15494 1726853341.33631: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15494 1726853341.33697: no more pending results, returning what we have
15494 1726853341.33700: results queue empty
15494 1726853341.33701: checking for any_errors_fatal
15494 1726853341.33710: done checking for any_errors_fatal
15494 1726853341.33711: checking for max_fail_percentage
15494 1726853341.33712: done checking for max_fail_percentage
15494 1726853341.33713: checking to see if all hosts have failed and the running result is not ok
15494 1726853341.33714: done checking to see if all hosts have failed
15494 1726853341.33715: getting the remaining hosts for this loop
15494 1726853341.33716: done getting the remaining hosts for this loop
15494 1726853341.33719: getting the next task for host managed_node1
15494 1726853341.33726: done getting next task for host managed_node1
15494 1726853341.33729: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15494 1726853341.33731: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853341.33744: getting variables
15494 1726853341.33745: in VariableManager get_vars()
15494 1726853341.33777: Calling all_inventory to load vars for managed_node1
15494 1726853341.33779: Calling groups_inventory to load vars for managed_node1
15494 1726853341.33781: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853341.33789: Calling all_plugins_play to load vars for managed_node1
15494 1726853341.33791: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853341.33793: Calling groups_plugins_play to load vars for managed_node1
15494 1726853341.34533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853341.35410: done with get_vars()
15494 1726853341.35424: done getting variables
15494 1726853341.35495: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024 13:29:01 -0400 (0:00:00.030) 0:00:09.970 ******
15494 1726853341.35515: entering _queue_task() for managed_node1/service
15494 1726853341.35517: Creating lock for service
15494 1726853341.35739: worker is 1 (out of 1 available)
15494 1726853341.35752: exiting _queue_task() for managed_node1/service
15494 1726853341.35764: done queuing things up, now waiting for results queue to drain
15494 1726853341.35766: waiting for pending results...
15494 1726853341.35935: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
15494 1726853341.36005: in run() - task 02083763-bbaf-0028-1a50-00000000001f
15494 1726853341.36016: variable 'ansible_search_path' from source: unknown
15494 1726853341.36019: variable 'ansible_search_path' from source: unknown
15494 1726853341.36050: calling self._execute()
15494 1726853341.36121: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853341.36125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853341.36134: variable 'omit' from source: magic vars
15494 1726853341.36401: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.36410: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853341.36493: variable '__network_wireless_connections_defined' from source: role '' defaults
15494 1726853341.36619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15494 1726853341.38282: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15494 1726853341.38332: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15494 1726853341.38359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15494 1726853341.38390: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15494 1726853341.38410: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15494 1726853341.38468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.38494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.38511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.38537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.38550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.38584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.38602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.38618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.38643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.38654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.38684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.38704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.38719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.38743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.38754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.38868: variable 'network_connections' from source: play vars
15494 1726853341.38882: variable 'interface' from source: set_fact
15494 1726853341.38938: variable 'interface' from source: set_fact
15494 1726853341.38948: variable 'interface' from source: set_fact
15494 1726853341.38989: variable 'interface' from source: set_fact
15494 1726853341.39041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15494 1726853341.39152: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15494 1726853341.39179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15494 1726853341.39211: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15494 1726853341.39232: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15494 1726853341.39266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15494 1726853341.39283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15494 1726853341.39300: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.39317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15494 1726853341.39364: variable '__network_team_connections_defined' from source: role '' defaults
15494 1726853341.39511: variable 'network_connections' from source: play vars
15494 1726853341.39514: variable 'interface' from source: set_fact
15494 1726853341.39556: variable 'interface' from source: set_fact
15494 1726853341.39562: variable 'interface' from source: set_fact
15494 1726853341.39606: variable 'interface' from source: set_fact
15494 1726853341.39630: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15494 1726853341.39634: when evaluation is False, skipping this task
15494 1726853341.39636: _execute() done
15494 1726853341.39639: dumping result to json
15494 1726853341.39641: done dumping result, returning
15494 1726853341.39650: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-0028-1a50-00000000001f]
15494 1726853341.39661: sending task result for task 02083763-bbaf-0028-1a50-00000000001f
15494 1726853341.39738: done sending task result for task 02083763-bbaf-0028-1a50-00000000001f
15494 1726853341.39740: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15494 1726853341.39821: no more pending results, returning what we have
15494 1726853341.39825: results queue empty
15494 1726853341.39826: checking for any_errors_fatal
15494 1726853341.39832: done checking for any_errors_fatal
15494 1726853341.39833: checking for max_fail_percentage
15494 1726853341.39835: done checking for max_fail_percentage
15494 1726853341.39836: checking to see if all hosts have failed and the running result is not ok
15494 1726853341.39837: done checking to see if all hosts have failed
15494 1726853341.39837: getting the remaining hosts for this loop
15494 1726853341.39839: done getting the remaining hosts for this loop
15494 1726853341.39843: getting the next task for host managed_node1
15494 1726853341.39853: done getting next task for host managed_node1
15494 1726853341.39857: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15494 1726853341.39858: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853341.39873: getting variables
15494 1726853341.39874: in VariableManager get_vars()
15494 1726853341.39908: Calling all_inventory to load vars for managed_node1
15494 1726853341.39911: Calling groups_inventory to load vars for managed_node1
15494 1726853341.39913: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853341.39924: Calling all_plugins_play to load vars for managed_node1
15494 1726853341.39926: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853341.39928: Calling groups_plugins_play to load vars for managed_node1
15494 1726853341.40821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853341.41718: done with get_vars()
15494 1726853341.41737: done getting variables
15494 1726853341.41785: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024 13:29:01 -0400 (0:00:00.062) 0:00:10.033 ******
15494 1726853341.41807: entering _queue_task() for managed_node1/service
15494 1726853341.42058: worker is 1 (out of 1 available)
15494 1726853341.42074: exiting _queue_task() for managed_node1/service
15494 1726853341.42085: done queuing things up, now waiting for results queue to drain
15494 1726853341.42087: waiting for pending results...
15494 1726853341.42255: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
15494 1726853341.42321: in run() - task 02083763-bbaf-0028-1a50-000000000020
15494 1726853341.42333: variable 'ansible_search_path' from source: unknown
15494 1726853341.42337: variable 'ansible_search_path' from source: unknown
15494 1726853341.42366: calling self._execute()
15494 1726853341.42441: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853341.42445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853341.42455: variable 'omit' from source: magic vars
15494 1726853341.42726: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.42735: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853341.42842: variable 'network_provider' from source: set_fact
15494 1726853341.42848: variable 'network_state' from source: role '' defaults
15494 1726853341.42860: Evaluated conditional (network_provider == "nm" or network_state != {}): True
15494 1726853341.42863: variable 'omit' from source: magic vars
15494 1726853341.42890: variable 'omit' from source: magic vars
15494 1726853341.42912: variable 'network_service_name' from source: role '' defaults
15494 1726853341.42967: variable 'network_service_name' from source: role '' defaults
15494 1726853341.43035: variable '__network_provider_setup' from source: role '' defaults
15494 1726853341.43039: variable '__network_service_name_default_nm' from source: role '' defaults
15494 1726853341.43087: variable '__network_service_name_default_nm' from source: role '' defaults
15494 1726853341.43095: variable '__network_packages_default_nm' from source: role '' defaults
15494 1726853341.43138: variable '__network_packages_default_nm' from source: role '' defaults
15494 1726853341.43284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15494 1726853341.44741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15494 1726853341.44795: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15494 1726853341.44824: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15494 1726853341.44852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15494 1726853341.44870: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15494 1726853341.44932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.44953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.44973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.44998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.45009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.45044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.45061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.45080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.45104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.45114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.45262: variable '__network_packages_default_gobject_packages' from source: role '' defaults
15494 1726853341.45339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.45357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.45381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.45405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.45415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.45485: variable 'ansible_python' from source: facts
15494 1726853341.45503: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
15494 1726853341.45561: variable '__network_wpa_supplicant_required' from source: role '' defaults
15494 1726853341.45616: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
15494 1726853341.45700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.45717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.45733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.45760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.45770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.45807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853341.45828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853341.45845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.45873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853341.45883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853341.45977: variable 'network_connections' from source: play vars
15494 1726853341.45984: variable 'interface' from source: set_fact
15494 1726853341.46037: variable 'interface' from source: set_fact
15494 1726853341.46046: variable 'interface' from source: set_fact
15494 1726853341.46099: variable 'interface' from source: set_fact
15494 1726853341.46174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15494 1726853341.46305: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15494 1726853341.46342: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15494 1726853341.46375: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15494 1726853341.46407: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15494 1726853341.46456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15494 1726853341.46478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15494 1726853341.46500: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853341.46522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15494 1726853341.46561: variable '__network_wireless_connections_defined' from source: role '' defaults
15494 1726853341.46735: variable 'network_connections' from source: play vars
15494 1726853341.46741: variable 'interface' from source: set_fact
15494 1726853341.46798: variable 'interface' from source: set_fact
15494 1726853341.46807: variable 'interface' from source: set_fact
15494 1726853341.46857: variable 'interface' from source: set_fact
15494 1726853341.46895: variable '__network_packages_default_wireless' from source: role '' defaults
15494 1726853341.46946: variable '__network_wireless_connections_defined' from source: role '' defaults
15494 1726853341.47133: variable 'network_connections' from source: play vars
15494 1726853341.47136: variable 'interface' from source: set_fact
15494 1726853341.47189: variable 'interface' from source: set_fact
15494 1726853341.47193: variable 'interface' from source: set_fact
15494 1726853341.47243: variable 'interface' from source: set_fact
15494 1726853341.47264: variable '__network_packages_default_team' from source: role '' defaults
15494 1726853341.47320: variable '__network_team_connections_defined' from source: role '' defaults
15494 1726853341.47502: variable 'network_connections' from source: play vars
15494 1726853341.47506: variable 'interface' from source: set_fact
15494 1726853341.47559: variable 'interface' from source: set_fact
15494 1726853341.47564: variable 'interface' from source: set_fact
15494 1726853341.47613: variable 'interface' from source: set_fact
15494 1726853341.47660: variable '__network_service_name_default_initscripts' from source: role '' defaults
15494 1726853341.47702: variable '__network_service_name_default_initscripts' from source: role '' defaults
15494 1726853341.47708: variable '__network_packages_default_initscripts' from source: role '' defaults
15494 1726853341.47753: variable '__network_packages_default_initscripts' from source: role '' defaults
15494 1726853341.47890: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
15494 1726853341.48343: variable 'network_connections' from source: play vars
15494 1726853341.48347: variable 'interface' from source: set_fact
15494 1726853341.48393: variable 'interface' from source: set_fact
15494 1726853341.48403: variable 'interface' from source: set_fact
15494 1726853341.48441: variable 'interface' from source: set_fact
15494 1726853341.48448: variable 'ansible_distribution' from source: facts
15494 1726853341.48455: variable '__network_rh_distros' from source: role '' defaults
15494 1726853341.48460: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.48484: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
15494 1726853341.48597: variable 'ansible_distribution' from source: facts
15494 1726853341.48600: variable '__network_rh_distros' from source: role '' defaults
15494 1726853341.48605: variable 'ansible_distribution_major_version' from source: facts
15494 1726853341.48615: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
15494 1726853341.48727: variable 'ansible_distribution' from source:
facts 15494 1726853341.48732: variable '__network_rh_distros' from source: role '' defaults 15494 1726853341.48734: variable 'ansible_distribution_major_version' from source: facts 15494 1726853341.48764: variable 'network_provider' from source: set_fact 15494 1726853341.48783: variable 'omit' from source: magic vars 15494 1726853341.48806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853341.48827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853341.48841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853341.48860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853341.48868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853341.48892: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853341.48895: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853341.48898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853341.48963: Set connection var ansible_connection to ssh 15494 1726853341.48975: Set connection var ansible_pipelining to False 15494 1726853341.48980: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853341.48983: Set connection var ansible_shell_type to sh 15494 1726853341.48988: Set connection var ansible_timeout to 10 15494 1726853341.48994: Set connection var ansible_shell_executable to /bin/sh 15494 1726853341.49013: variable 'ansible_shell_executable' from source: unknown 15494 1726853341.49015: variable 'ansible_connection' from source: unknown 15494 1726853341.49019: variable 'ansible_module_compression' from source: unknown 15494 1726853341.49021: 
variable 'ansible_shell_type' from source: unknown 15494 1726853341.49024: variable 'ansible_shell_executable' from source: unknown 15494 1726853341.49026: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853341.49033: variable 'ansible_pipelining' from source: unknown 15494 1726853341.49035: variable 'ansible_timeout' from source: unknown 15494 1726853341.49037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853341.49124: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853341.49132: variable 'omit' from source: magic vars 15494 1726853341.49138: starting attempt loop 15494 1726853341.49141: running the handler 15494 1726853341.49199: variable 'ansible_facts' from source: unknown 15494 1726853341.49635: _low_level_execute_command(): starting 15494 1726853341.49640: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853341.50129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853341.50133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.50136: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15494 1726853341.50138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.50195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853341.50198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853341.50201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853341.50253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853341.51935: stdout chunk (state=3): >>>/root <<< 15494 1726853341.52032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853341.52064: stderr chunk (state=3): >>><<< 15494 1726853341.52068: stdout chunk (state=3): >>><<< 15494 1726853341.52089: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853341.52101: _low_level_execute_command(): starting 15494 1726853341.52107: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577 `" && echo ansible-tmp-1726853341.5208943-16040-46315781922577="` echo /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577 `" ) && sleep 0' 15494 1726853341.52549: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853341.52552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853341.52555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853341.52557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853341.52559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.52613: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853341.52620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853341.52622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853341.52661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853341.54610: stdout chunk (state=3): >>>ansible-tmp-1726853341.5208943-16040-46315781922577=/root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577 <<< 15494 1726853341.54715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853341.54740: stderr chunk (state=3): >>><<< 15494 1726853341.54743: stdout chunk (state=3): >>><<< 15494 1726853341.54760: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853341.5208943-16040-46315781922577=/root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853341.54788: variable 'ansible_module_compression' from source: unknown 15494 1726853341.54835: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15494 1726853341.54839: ANSIBALLZ: Acquiring lock 15494 1726853341.54846: ANSIBALLZ: Lock acquired: 140002372342736 15494 1726853341.54849: ANSIBALLZ: Creating module 15494 1726853341.73011: ANSIBALLZ: Writing module into payload 15494 1726853341.73118: ANSIBALLZ: Writing module 15494 1726853341.73143: ANSIBALLZ: Renaming module 15494 1726853341.73147: ANSIBALLZ: Done creating module 15494 1726853341.73181: variable 'ansible_facts' from source: unknown 15494 1726853341.73319: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py 15494 1726853341.73429: Sending initial data 15494 1726853341.73432: Sent initial data (155 bytes) 15494 1726853341.73908: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853341.73912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.73914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853341.73916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.73956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853341.73976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853341.73979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853341.74034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853341.75711: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853341.75751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853341.75788: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpkf2ivntk /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py <<< 15494 1726853341.75797: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py" <<< 15494 1726853341.75834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpkf2ivntk" to remote "/root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py" <<< 15494 1726853341.75837: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py" <<< 15494 1726853341.76909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853341.76958: stderr chunk (state=3): >>><<< 15494 1726853341.76961: stdout chunk (state=3): >>><<< 15494 1726853341.76987: done transferring module to remote 15494 1726853341.76996: _low_level_execute_command(): starting 15494 1726853341.77001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/ /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py && sleep 0' 15494 1726853341.77459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853341.77463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853341.77465: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853341.77469: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.77523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853341.77526: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853341.77530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853341.77569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853341.79653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853341.79658: stdout chunk (state=3): >>><<< 15494 1726853341.79661: stderr chunk (state=3): >>><<< 15494 1726853341.79783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853341.79788: _low_level_execute_command(): starting 15494 1726853341.79791: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/AnsiballZ_systemd.py && sleep 0' 15494 1726853341.80389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853341.80404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853341.80436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853341.80462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853341.80581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853341.80585: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853341.80603: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853341.80626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853341.80730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853342.10370: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": 
"{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call or<<< 15494 1726853342.10412: stdout chunk (state=3): >>>g.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10596352", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313254400", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "721810000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", 
"MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "Private<<< 15494 1726853342.10417: stdout chunk (state=3): >>>IPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15494 1726853342.12578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853342.12583: stdout chunk (state=3): >>><<< 15494 1726853342.12585: stderr chunk (state=3): >>><<< 15494 1726853342.12589: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10596352", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313254400", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "721810000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853342.12717: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853342.12745: _low_level_execute_command(): starting 15494 1726853342.12825: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853341.5208943-16040-46315781922577/ > /dev/null 2>&1 && sleep 0' 15494 1726853342.13419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853342.13436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853342.13452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853342.13582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853342.13587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853342.13617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853342.13634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853342.13710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853342.15524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853342.15556: stderr chunk (state=3): >>><<< 15494 1726853342.15559: stdout chunk (state=3): >>><<< 15494 1726853342.15574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853342.15581: handler run complete 15494 1726853342.15618: attempt loop complete, returning result 15494 1726853342.15621: _execute() done 15494 1726853342.15623: dumping result to json 15494 1726853342.15635: done dumping result, returning 15494 1726853342.15647: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-0028-1a50-000000000020] 15494 1726853342.15650: sending task result for task 02083763-bbaf-0028-1a50-000000000020 15494 1726853342.15892: done sending task result for task 02083763-bbaf-0028-1a50-000000000020 15494 1726853342.15895: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853342.15940: no more pending results, returning what we have 15494 1726853342.15943: results queue empty 15494 1726853342.15943: checking for any_errors_fatal 15494 1726853342.15953: done checking for any_errors_fatal 15494 1726853342.15954: checking for max_fail_percentage 15494 1726853342.15955: done checking for max_fail_percentage 15494 1726853342.15956: checking to see if all hosts have failed and the running result is not ok 15494 1726853342.15957: done checking to see if all hosts have failed 15494 1726853342.15957: getting the remaining hosts for this loop 15494 1726853342.15959: done getting the remaining hosts for this loop 15494 1726853342.15962: getting the next task for host managed_node1 15494 1726853342.15967: done getting next task for host managed_node1 15494 1726853342.15972: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15494 1726853342.15974: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853342.15989: getting variables 15494 1726853342.15991: in VariableManager get_vars() 15494 1726853342.16030: Calling all_inventory to load vars for managed_node1 15494 1726853342.16032: Calling groups_inventory to load vars for managed_node1 15494 1726853342.16035: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853342.16044: Calling all_plugins_play to load vars for managed_node1 15494 1726853342.16050: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853342.16054: Calling groups_plugins_play to load vars for managed_node1 15494 1726853342.17302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853342.18163: done with get_vars() 15494 1726853342.18182: done getting variables 15494 1726853342.18224: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:29:02 -0400 (0:00:00.764) 0:00:10.798 ****** 15494 1726853342.18244: entering _queue_task() for managed_node1/service 15494 1726853342.18474: worker is 1 (out of 1 available) 15494 1726853342.18487: exiting _queue_task() for managed_node1/service 15494 1726853342.18499: done queuing things up, now waiting for results queue to drain 15494 1726853342.18500: waiting for pending results... 
15494 1726853342.18669: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15494 1726853342.18732: in run() - task 02083763-bbaf-0028-1a50-000000000021 15494 1726853342.18742: variable 'ansible_search_path' from source: unknown 15494 1726853342.18746: variable 'ansible_search_path' from source: unknown 15494 1726853342.18779: calling self._execute() 15494 1726853342.18850: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853342.18857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853342.18865: variable 'omit' from source: magic vars 15494 1726853342.19276: variable 'ansible_distribution_major_version' from source: facts 15494 1726853342.19279: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853342.19282: variable 'network_provider' from source: set_fact 15494 1726853342.19296: Evaluated conditional (network_provider == "nm"): True 15494 1726853342.19389: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853342.19487: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853342.19666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853342.21304: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853342.21352: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853342.21378: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853342.21403: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853342.21422: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853342.21493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853342.21513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853342.21530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853342.21556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853342.21567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853342.21603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853342.21620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853342.21636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853342.21662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853342.21674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853342.21704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853342.21719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853342.21735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853342.21760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853342.21770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853342.21866: variable 'network_connections' from source: play vars 15494 1726853342.21879: variable 'interface' from source: set_fact 15494 1726853342.21932: variable 'interface' from source: set_fact 15494 1726853342.21940: variable 'interface' from source: set_fact 15494 1726853342.21984: variable 'interface' from source: set_fact 15494 1726853342.22036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853342.22172: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853342.22212: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853342.22236: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853342.22272: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853342.22299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853342.22319: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853342.22338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853342.22476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853342.22480: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853342.22663: variable 'network_connections' from source: play vars 15494 1726853342.22677: variable 'interface' from source: set_fact 15494 1726853342.22736: variable 'interface' from source: set_fact 15494 1726853342.22754: variable 'interface' from source: set_fact 15494 1726853342.22816: variable 'interface' from source: set_fact 15494 1726853342.22861: Evaluated conditional (__network_wpa_supplicant_required): False 15494 1726853342.22864: when evaluation is False, skipping this task 15494 1726853342.22872: _execute() done 15494 1726853342.22892: dumping result 
to json
15494 1726853342.22894: done dumping result, returning
15494 1726853342.22897: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-0028-1a50-000000000021]
15494 1726853342.22904: sending task result for task 02083763-bbaf-0028-1a50-000000000021
15494 1726853342.23009: done sending task result for task 02083763-bbaf-0028-1a50-000000000021
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
15494 1726853342.23054: no more pending results, returning what we have
15494 1726853342.23059: results queue empty
15494 1726853342.23064: checking for any_errors_fatal
15494 1726853342.23089: done checking for any_errors_fatal
15494 1726853342.23090: checking for max_fail_percentage
15494 1726853342.23092: done checking for max_fail_percentage
15494 1726853342.23093: checking to see if all hosts have failed and the running result is not ok
15494 1726853342.23094: done checking to see if all hosts have failed
15494 1726853342.23094: getting the remaining hosts for this loop
15494 1726853342.23096: done getting the remaining hosts for this loop
15494 1726853342.23178: getting the next task for host managed_node1
15494 1726853342.23183: done getting next task for host managed_node1
15494 1726853342.23187: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
15494 1726853342.23189: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853342.23201: getting variables
15494 1726853342.23202: in VariableManager get_vars()
15494 1726853342.23282: Calling all_inventory to load vars for managed_node1
15494 1726853342.23284: Calling groups_inventory to load vars for managed_node1
15494 1726853342.23287: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853342.23293: WORKER PROCESS EXITING
15494 1726853342.23302: Calling all_plugins_play to load vars for managed_node1
15494 1726853342.23305: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853342.23308: Calling groups_plugins_play to load vars for managed_node1
15494 1726853342.24374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853342.25254: done with get_vars()
15494 1726853342.25275: done getting variables
15494 1726853342.25319: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Friday 20 September 2024  13:29:02 -0400 (0:00:00.070)       0:00:10.869 ******
15494 1726853342.25340: entering _queue_task() for managed_node1/service
15494 1726853342.25592: worker is 1 (out of 1 available)
15494 1726853342.25605: exiting _queue_task() for managed_node1/service
15494 1726853342.25618: done queuing things up, now waiting for results queue to drain
15494 1726853342.25619: waiting for pending results...
15494 1726853342.25788: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service
15494 1726853342.25858: in run() - task 02083763-bbaf-0028-1a50-000000000022
15494 1726853342.25869: variable 'ansible_search_path' from source: unknown
15494 1726853342.25874: variable 'ansible_search_path' from source: unknown
15494 1726853342.25903: calling self._execute()
15494 1726853342.26177: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853342.26180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853342.26183: variable 'omit' from source: magic vars
15494 1726853342.26372: variable 'ansible_distribution_major_version' from source: facts
15494 1726853342.26390: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853342.26506: variable 'network_provider' from source: set_fact
15494 1726853342.26517: Evaluated conditional (network_provider == "initscripts"): False
15494 1726853342.26525: when evaluation is False, skipping this task
15494 1726853342.26532: _execute() done
15494 1726853342.26538: dumping result to json
15494 1726853342.26545: done dumping result, returning
15494 1726853342.26556: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-0028-1a50-000000000022]
15494 1726853342.26565: sending task result for task 02083763-bbaf-0028-1a50-000000000022
15494 1726853342.26877: done sending task result for task 02083763-bbaf-0028-1a50-000000000022
15494 1726853342.26881: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15494 1726853342.26913: no more pending results, returning what we have
15494 1726853342.26916: results queue empty
15494 1726853342.26917: checking for any_errors_fatal
15494 1726853342.26931: done checking for any_errors_fatal
15494 1726853342.26932: checking for max_fail_percentage
15494 1726853342.26934: done checking for max_fail_percentage
15494 1726853342.26934: checking to see if all hosts have failed and the running result is not ok
15494 1726853342.26935: done checking to see if all hosts have failed
15494 1726853342.26936: getting the remaining hosts for this loop
15494 1726853342.26937: done getting the remaining hosts for this loop
15494 1726853342.26940: getting the next task for host managed_node1
15494 1726853342.26946: done getting next task for host managed_node1
15494 1726853342.26949: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
15494 1726853342.26951: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853342.26964: getting variables
15494 1726853342.26965: in VariableManager get_vars()
15494 1726853342.26998: Calling all_inventory to load vars for managed_node1
15494 1726853342.27001: Calling groups_inventory to load vars for managed_node1
15494 1726853342.27003: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853342.27013: Calling all_plugins_play to load vars for managed_node1
15494 1726853342.27015: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853342.27017: Calling groups_plugins_play to load vars for managed_node1
15494 1726853342.27847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853342.28719: done with get_vars()
15494 1726853342.28732: done getting variables
15494 1726853342.28775: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Friday 20 September 2024  13:29:02 -0400 (0:00:00.034)       0:00:10.903 ******
15494 1726853342.28797: entering _queue_task() for managed_node1/copy
15494 1726853342.29005: worker is 1 (out of 1 available)
15494 1726853342.29019: exiting _queue_task() for managed_node1/copy
15494 1726853342.29031: done queuing things up, now waiting for results queue to drain
15494 1726853342.29032: waiting for pending results...
15494 1726853342.29195: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
15494 1726853342.29266: in run() - task 02083763-bbaf-0028-1a50-000000000023
15494 1726853342.29274: variable 'ansible_search_path' from source: unknown
15494 1726853342.29277: variable 'ansible_search_path' from source: unknown
15494 1726853342.29306: calling self._execute()
15494 1726853342.29378: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853342.29382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853342.29393: variable 'omit' from source: magic vars
15494 1726853342.29655: variable 'ansible_distribution_major_version' from source: facts
15494 1726853342.29665: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853342.29742: variable 'network_provider' from source: set_fact
15494 1726853342.29746: Evaluated conditional (network_provider == "initscripts"): False
15494 1726853342.29751: when evaluation is False, skipping this task
15494 1726853342.29754: _execute() done
15494 1726853342.29757: dumping result to json
15494 1726853342.29761: done dumping result, returning
15494 1726853342.29768: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-0028-1a50-000000000023]
15494 1726853342.29774: sending task result for task 02083763-bbaf-0028-1a50-000000000023
15494 1726853342.29856: done sending task result for task 02083763-bbaf-0028-1a50-000000000023
15494 1726853342.29858: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
15494 1726853342.29903: no more pending results, returning what we have
15494 1726853342.29907: results queue empty
15494 1726853342.29908: checking for any_errors_fatal
15494 1726853342.29917: done checking for any_errors_fatal
15494 1726853342.29918: checking for max_fail_percentage
15494 1726853342.29919: done checking for max_fail_percentage
15494 1726853342.29920: checking to see if all hosts have failed and the running result is not ok
15494 1726853342.29921: done checking to see if all hosts have failed
15494 1726853342.29921: getting the remaining hosts for this loop
15494 1726853342.29923: done getting the remaining hosts for this loop
15494 1726853342.29926: getting the next task for host managed_node1
15494 1726853342.29931: done getting next task for host managed_node1
15494 1726853342.29935: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
15494 1726853342.29936: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853342.29949: getting variables
15494 1726853342.29950: in VariableManager get_vars()
15494 1726853342.29981: Calling all_inventory to load vars for managed_node1
15494 1726853342.29984: Calling groups_inventory to load vars for managed_node1
15494 1726853342.29986: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853342.29994: Calling all_plugins_play to load vars for managed_node1
15494 1726853342.29996: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853342.29999: Calling groups_plugins_play to load vars for managed_node1
15494 1726853342.30717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853342.31684: done with get_vars()
15494 1726853342.31699: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Friday 20 September 2024  13:29:02 -0400 (0:00:00.029)       0:00:10.933 ******
15494 1726853342.31755: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
15494 1726853342.31757: Creating lock for fedora.linux_system_roles.network_connections
15494 1726853342.31987: worker is 1 (out of 1 available)
15494 1726853342.32001: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections
15494 1726853342.32013: done queuing things up, now waiting for results queue to drain
15494 1726853342.32015: waiting for pending results...
15494 1726853342.32178: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
15494 1726853342.32248: in run() - task 02083763-bbaf-0028-1a50-000000000024
15494 1726853342.32258: variable 'ansible_search_path' from source: unknown
15494 1726853342.32262: variable 'ansible_search_path' from source: unknown
15494 1726853342.32290: calling self._execute()
15494 1726853342.32360: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853342.32363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853342.32374: variable 'omit' from source: magic vars
15494 1726853342.32635: variable 'ansible_distribution_major_version' from source: facts
15494 1726853342.32645: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853342.32653: variable 'omit' from source: magic vars
15494 1726853342.32681: variable 'omit' from source: magic vars
15494 1726853342.32791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15494 1726853342.34777: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15494 1726853342.34785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15494 1726853342.34792: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15494 1726853342.34795: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15494 1726853342.34797: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15494 1726853342.34814: variable 'network_provider' from source: set_fact
15494 1726853342.34932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853342.34988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853342.35025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853342.35069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853342.35077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853342.35127: variable 'omit' from source: magic vars
15494 1726853342.35208: variable 'omit' from source: magic vars
15494 1726853342.35281: variable 'network_connections' from source: play vars
15494 1726853342.35290: variable 'interface' from source: set_fact
15494 1726853342.35356: variable 'interface' from source: set_fact
15494 1726853342.35361: variable 'interface' from source: set_fact
15494 1726853342.35576: variable 'interface' from source: set_fact
15494 1726853342.35579: variable 'omit' from source: magic vars
15494 1726853342.35581: variable '__lsr_ansible_managed' from source: task vars
15494 1726853342.35621: variable '__lsr_ansible_managed' from source: task vars
15494 1726853342.35783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup
15494 1726853342.35990: Loaded config def from plugin (lookup/template)
15494 1726853342.35999: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py
15494 1726853342.36029: File lookup term: get_ansible_managed.j2
15494 1726853342.36036: variable 'ansible_search_path' from source: unknown
15494 1726853342.36044: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks
15494 1726853342.36060: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2
15494 1726853342.36083: variable 'ansible_search_path' from source: unknown
15494 1726853342.40720: variable 'ansible_managed' from source: unknown
15494 1726853342.40801: variable 'omit' from source: magic vars
15494 1726853342.40821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15494 1726853342.40842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15494 1726853342.40857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15494 1726853342.40870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15494 1726853342.40881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15494 1726853342.40901: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15494 1726853342.40904: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853342.40907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853342.40968: Set connection var ansible_connection to ssh
15494 1726853342.40974: Set connection var ansible_pipelining to False
15494 1726853342.40981: Set connection var ansible_module_compression to ZIP_DEFLATED
15494 1726853342.40984: Set connection var ansible_shell_type to sh
15494 1726853342.40991: Set connection var ansible_timeout to 10
15494 1726853342.40995: Set connection var ansible_shell_executable to /bin/sh
15494 1726853342.41013: variable 'ansible_shell_executable' from source: unknown
15494 1726853342.41016: variable 'ansible_connection' from source: unknown
15494 1726853342.41018: variable 'ansible_module_compression' from source: unknown
15494 1726853342.41020: variable 'ansible_shell_type' from source: unknown
15494 1726853342.41023: variable 'ansible_shell_executable' from source: unknown
15494 1726853342.41026: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853342.41030: variable 'ansible_pipelining' from source: unknown
15494 1726853342.41033: variable 'ansible_timeout' from source: unknown
15494 1726853342.41037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853342.41130: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
15494 1726853342.41142: variable 'omit' from source: magic vars
15494 1726853342.41144: starting attempt loop
15494 1726853342.41150: running the handler
15494 1726853342.41158: _low_level_execute_command(): starting
15494 1726853342.41166: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
15494 1726853342.41855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15494 1726853342.41859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15494 1726853342.41862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15494 1726853342.41864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
15494 1726853342.41908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15494 1726853342.41950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15494 1726853342.43612: stdout chunk (state=3): >>>/root <<<
15494 1726853342.43755: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15494 1726853342.43773: stdout chunk (state=3): >>><<<
15494 1726853342.43786: stderr chunk (state=3): >>><<<
15494 1726853342.43811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15494 1726853342.43831: _low_level_execute_command(): starting
15494 1726853342.43843: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357 `" && echo ansible-tmp-1726853342.438178-16066-204479797377357="` echo /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357 `" ) && sleep 0'
15494 1726853342.44562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15494 1726853342.44654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15494 1726853342.44670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15494 1726853342.44728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<<
15494 1726853342.44749: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
15494 1726853342.44807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15494 1726853342.44855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15494 1726853342.46760: stdout chunk (state=3): >>>ansible-tmp-1726853342.438178-16066-204479797377357=/root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357 <<<
15494 1726853342.46878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15494 1726853342.46932: stderr chunk (state=3): >>><<<
15494 1726853342.46935: stdout chunk (state=3): >>><<<
15494 1726853342.46953: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853342.438178-16066-204479797377357=/root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15494 1726853342.47182: variable 'ansible_module_compression' from source: unknown
15494 1726853342.47185: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections
15494 1726853342.47187: ANSIBALLZ: Acquiring lock
15494 1726853342.47189: ANSIBALLZ: Lock acquired: 140002368290640
15494 1726853342.47191: ANSIBALLZ: Creating module
15494 1726853342.67492: ANSIBALLZ: Writing module into payload
15494 1726853342.67806: ANSIBALLZ: Writing module
15494 1726853342.67838: ANSIBALLZ: Renaming module
15494 1726853342.67851: ANSIBALLZ: Done creating module
15494 1726853342.67884: variable 'ansible_facts' from source: unknown
15494 1726853342.67996: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py
15494 1726853342.68224: Sending initial data
15494 1726853342.68233: Sent initial data (167 bytes)
15494 1726853342.68727: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15494 1726853342.68742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15494 1726853342.68764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15494 1726853342.68786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15494 1726853342.68886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<<
15494 1726853342.68908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15494 1726853342.68992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15494 1726853342.70706: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
15494 1726853342.70713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
15494 1726853342.70757: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmptpd58755 /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py <<<
15494 1726853342.70762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py" <<<
15494 1726853342.70798: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmptpd58755" to remote "/root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py" <<<
15494 1726853342.72452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15494 1726853342.72637: stderr chunk (state=3): >>><<<
15494 1726853342.72640: stdout chunk (state=3): >>><<<
15494 1726853342.72643: done transferring module to remote
15494 1726853342.72645: _low_level_execute_command(): starting
15494 1726853342.72647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/ /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py && sleep 0'
15494 1726853342.73788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15494 1726853342.73933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<<
15494 1726853342.73944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15494 1726853342.74014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15494 1726853342.75861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15494 1726853342.75901: stderr chunk (state=3): >>><<<
15494 1726853342.75921: stdout chunk (state=3): >>><<<
15494 1726853342.75943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
15494 1726853342.75953: _low_level_execute_command(): starting
15494 1726853342.75963: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/AnsiballZ_network_connections.py && sleep 0'
15494 1726853342.76553: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
15494 1726853342.76568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
15494 1726853342.76585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
15494 1726853342.76604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
15494 1726853342.76622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<<
15494 1726853342.76635: stderr chunk (state=3): >>>debug2: match not found <<<
15494 1726853342.76651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15494 1726853342.76682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
15494 1726853342.76862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<<
15494 1726853342.76881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
15494 1726853342.77084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15494 1726853343.08977: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
15494 1726853343.11040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15494 1726853343.11108: stderr chunk (state=3): >>>Shared connection to
10.31.45.153 closed. <<< 15494 1726853343.11126: stdout chunk (state=3): >>><<< 15494 1726853343.11137: stderr chunk (state=3): >>><<< 15494 1726853343.11164: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853343.11215: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853343.11237: _low_level_execute_command(): starting 15494 1726853343.11249: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853342.438178-16066-204479797377357/ > /dev/null 2>&1 && sleep 0' 15494 1726853343.11926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853343.11941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853343.12124: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.12195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.12382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.12407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.12477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.14467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853343.14496: stdout chunk (state=3): >>><<< 15494 1726853343.14576: stderr chunk (state=3): >>><<< 15494 1726853343.14580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853343.14593: handler run complete 15494 1726853343.14596: attempt loop complete, returning result 15494 1726853343.14598: _execute() done 15494 1726853343.14600: dumping result to json 15494 1726853343.14636: done dumping result, returning 15494 1726853343.14691: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-0028-1a50-000000000024] 15494 1726853343.14707: sending task result for task 02083763-bbaf-0028-1a50-000000000024 changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc (not-active) 15494 1726853343.15198: no more pending results, returning what we have 15494 
1726853343.15202: results queue empty 15494 1726853343.15203: checking for any_errors_fatal 15494 1726853343.15211: done checking for any_errors_fatal 15494 1726853343.15212: checking for max_fail_percentage 15494 1726853343.15214: done checking for max_fail_percentage 15494 1726853343.15215: checking to see if all hosts have failed and the running result is not ok 15494 1726853343.15215: done checking to see if all hosts have failed 15494 1726853343.15216: getting the remaining hosts for this loop 15494 1726853343.15218: done getting the remaining hosts for this loop 15494 1726853343.15227: getting the next task for host managed_node1 15494 1726853343.15234: done getting next task for host managed_node1 15494 1726853343.15238: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15494 1726853343.15241: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853343.15253: getting variables 15494 1726853343.15255: in VariableManager get_vars() 15494 1726853343.15514: Calling all_inventory to load vars for managed_node1 15494 1726853343.15617: Calling groups_inventory to load vars for managed_node1 15494 1726853343.15621: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853343.15627: done sending task result for task 02083763-bbaf-0028-1a50-000000000024 15494 1726853343.15630: WORKER PROCESS EXITING 15494 1726853343.15644: Calling all_plugins_play to load vars for managed_node1 15494 1726853343.15652: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853343.15656: Calling groups_plugins_play to load vars for managed_node1 15494 1726853343.19142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853343.21518: done with get_vars() 15494 1726853343.21599: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:29:03 -0400 (0:00:00.900) 0:00:11.834 ****** 15494 1726853343.21836: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15494 1726853343.21838: Creating lock for fedora.linux_system_roles.network_state 15494 1726853343.22398: worker is 1 (out of 1 available) 15494 1726853343.22449: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15494 1726853343.22464: done queuing things up, now waiting for results queue to drain 15494 1726853343.22465: waiting for pending results... 
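The `module_args` logged for the just-completed "Configure networking connection profiles" task correspond to role variables along these lines. This is a sketch reconstructed from the logged invocation only; the actual playbook is not part of this log:

```yaml
# Reconstructed sketch from the logged module_args -- not the original playbook.
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: LSR-TST-br31
            interface_name: LSR-TST-br31
            state: up
            type: bridge
            ip:
              dhcp4: false
              auto6: true
```

The role translates this list into the `fedora.linux_system_roles.network_connections` module call seen above (provider `nm`), which is why the result reports `changed: true` with the add/up steps in its stderr.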
15494 1726853343.22749: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15494 1726853343.22866: in run() - task 02083763-bbaf-0028-1a50-000000000025 15494 1726853343.22894: variable 'ansible_search_path' from source: unknown 15494 1726853343.22903: variable 'ansible_search_path' from source: unknown 15494 1726853343.22948: calling self._execute() 15494 1726853343.23063: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.23081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.23100: variable 'omit' from source: magic vars 15494 1726853343.23474: variable 'ansible_distribution_major_version' from source: facts 15494 1726853343.23490: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853343.23612: variable 'network_state' from source: role '' defaults 15494 1726853343.23627: Evaluated conditional (network_state != {}): False 15494 1726853343.23634: when evaluation is False, skipping this task 15494 1726853343.23642: _execute() done 15494 1726853343.23649: dumping result to json 15494 1726853343.23657: done dumping result, returning 15494 1726853343.23667: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-0028-1a50-000000000025] 15494 1726853343.23681: sending task result for task 02083763-bbaf-0028-1a50-000000000025 15494 1726853343.23945: done sending task result for task 02083763-bbaf-0028-1a50-000000000025 15494 1726853343.23948: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853343.23999: no more pending results, returning what we have 15494 1726853343.24003: results queue empty 15494 1726853343.24004: checking for any_errors_fatal 15494 1726853343.24016: done checking for any_errors_fatal 
15494 1726853343.24016: checking for max_fail_percentage 15494 1726853343.24018: done checking for max_fail_percentage 15494 1726853343.24019: checking to see if all hosts have failed and the running result is not ok 15494 1726853343.24020: done checking to see if all hosts have failed 15494 1726853343.24021: getting the remaining hosts for this loop 15494 1726853343.24023: done getting the remaining hosts for this loop 15494 1726853343.24026: getting the next task for host managed_node1 15494 1726853343.24034: done getting next task for host managed_node1 15494 1726853343.24038: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15494 1726853343.24040: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853343.24057: getting variables 15494 1726853343.24059: in VariableManager get_vars() 15494 1726853343.24101: Calling all_inventory to load vars for managed_node1 15494 1726853343.24104: Calling groups_inventory to load vars for managed_node1 15494 1726853343.24107: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853343.24119: Calling all_plugins_play to load vars for managed_node1 15494 1726853343.24122: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853343.24125: Calling groups_plugins_play to load vars for managed_node1 15494 1726853343.25664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853343.27215: done with get_vars() 15494 1726853343.27238: done getting variables 15494 1726853343.27304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:29:03 -0400 (0:00:00.055) 0:00:11.889 ****** 15494 1726853343.27339: entering _queue_task() for managed_node1/debug 15494 1726853343.27668: worker is 1 (out of 1 available) 15494 1726853343.27882: exiting _queue_task() for managed_node1/debug 15494 1726853343.27893: done queuing things up, now waiting for results queue to drain 15494 1726853343.27895: waiting for pending results... 
15494 1726853343.27967: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15494 1726853343.28079: in run() - task 02083763-bbaf-0028-1a50-000000000026 15494 1726853343.28099: variable 'ansible_search_path' from source: unknown 15494 1726853343.28106: variable 'ansible_search_path' from source: unknown 15494 1726853343.28151: calling self._execute() 15494 1726853343.28260: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.28276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.28294: variable 'omit' from source: magic vars 15494 1726853343.28692: variable 'ansible_distribution_major_version' from source: facts 15494 1726853343.28773: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853343.28778: variable 'omit' from source: magic vars 15494 1726853343.28781: variable 'omit' from source: magic vars 15494 1726853343.28811: variable 'omit' from source: magic vars 15494 1726853343.28860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853343.28909: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853343.28936: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853343.28960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853343.28986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853343.29020: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853343.29031: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.29039: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15494 1726853343.29204: Set connection var ansible_connection to ssh 15494 1726853343.29207: Set connection var ansible_pipelining to False 15494 1726853343.29209: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853343.29211: Set connection var ansible_shell_type to sh 15494 1726853343.29213: Set connection var ansible_timeout to 10 15494 1726853343.29215: Set connection var ansible_shell_executable to /bin/sh 15494 1726853343.29229: variable 'ansible_shell_executable' from source: unknown 15494 1726853343.29237: variable 'ansible_connection' from source: unknown 15494 1726853343.29245: variable 'ansible_module_compression' from source: unknown 15494 1726853343.29251: variable 'ansible_shell_type' from source: unknown 15494 1726853343.29256: variable 'ansible_shell_executable' from source: unknown 15494 1726853343.29261: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.29273: variable 'ansible_pipelining' from source: unknown 15494 1726853343.29282: variable 'ansible_timeout' from source: unknown 15494 1726853343.29291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.29428: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853343.29475: variable 'omit' from source: magic vars 15494 1726853343.29480: starting attempt loop 15494 1726853343.29483: running the handler 15494 1726853343.29596: variable '__network_connections_result' from source: set_fact 15494 1726853343.29657: handler run complete 15494 1726853343.29686: attempt loop complete, returning result 15494 1726853343.29746: _execute() done 15494 1726853343.29749: dumping result to json 15494 1726853343.29752: 
done dumping result, returning 15494 1726853343.29754: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-0028-1a50-000000000026] 15494 1726853343.29756: sending task result for task 02083763-bbaf-0028-1a50-000000000026 15494 1726853343.29829: done sending task result for task 02083763-bbaf-0028-1a50-000000000026 ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc (not-active)" ] } 15494 1726853343.29918: no more pending results, returning what we have 15494 1726853343.29923: results queue empty 15494 1726853343.29924: checking for any_errors_fatal 15494 1726853343.29933: done checking for any_errors_fatal 15494 1726853343.29934: checking for max_fail_percentage 15494 1726853343.29936: done checking for max_fail_percentage 15494 1726853343.29937: checking to see if all hosts have failed and the running result is not ok 15494 1726853343.29938: done checking to see if all hosts have failed 15494 1726853343.29938: getting the remaining hosts for this loop 15494 1726853343.29940: done getting the remaining hosts for this loop 15494 1726853343.29943: getting the next task for host managed_node1 15494 1726853343.29950: done getting next task for host managed_node1 15494 1726853343.29954: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15494 1726853343.29957: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853343.29966: getting variables 15494 1726853343.29968: in VariableManager get_vars() 15494 1726853343.30007: Calling all_inventory to load vars for managed_node1 15494 1726853343.30009: Calling groups_inventory to load vars for managed_node1 15494 1726853343.30012: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853343.30024: Calling all_plugins_play to load vars for managed_node1 15494 1726853343.30027: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853343.30030: Calling groups_plugins_play to load vars for managed_node1 15494 1726853343.30684: WORKER PROCESS EXITING 15494 1726853343.31599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853343.33147: done with get_vars() 15494 1726853343.33174: done getting variables 15494 1726853343.33232: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:29:03 -0400 (0:00:00.059) 0:00:11.948 ****** 15494 1726853343.33262: entering _queue_task() for managed_node1/debug 15494 1726853343.33577: worker is 1 (out of 1 available) 15494 1726853343.33589: exiting _queue_task() for managed_node1/debug 15494 1726853343.33601: done queuing things up, now waiting for results queue to drain 15494 1726853343.33602: waiting for pending results... 
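Each task header in this verbose output ends with a per-task duration and a running total, e.g. `(0:00:00.059) 0:00:11.948 ******`. A small script can pull these out for profiling a run; this is a sketch assuming only the timestamp layout shown in this log:

```python
import re

# Matches the timing suffix Ansible's profile output prints under each task
# header in this log, e.g. "(0:00:00.055) 0:00:11.889 ******":
# first group = this task's duration, second group = cumulative elapsed time.
TIMING = re.compile(
    r"\((?P<task>\d+:\d+:\d+\.\d+)\)\s+(?P<total>\d+:\d+:\d+\.\d+)\s+\*+"
)

def parse_durations(text):
    """Return (task_seconds, total_seconds) pairs for every timing line found."""
    def to_seconds(hms):
        h, m, s = hms.split(":")
        return int(h) * 3600 + int(m) * 60 + float(s)
    return [
        (to_seconds(m.group("task")), to_seconds(m.group("total")))
        for m in TIMING.finditer(text)
    ]

sample = "Friday 20 September 2024 13:29:03 -0400 (0:00:00.900) 0:00:11.834 ******"
print(parse_durations(sample))  # -> [(0.9, 11.834)]
```

Sorting the extracted pairs by task duration quickly surfaces the slowest tasks; in this excerpt the 0.900 s "Configure networking connection profiles" task dominates the surrounding debug and skip tasks, which each take well under 0.1 s.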
15494 1726853343.33864: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15494 1726853343.33964: in run() - task 02083763-bbaf-0028-1a50-000000000027 15494 1726853343.33991: variable 'ansible_search_path' from source: unknown 15494 1726853343.33998: variable 'ansible_search_path' from source: unknown 15494 1726853343.34037: calling self._execute() 15494 1726853343.34133: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.34144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.34158: variable 'omit' from source: magic vars 15494 1726853343.34530: variable 'ansible_distribution_major_version' from source: facts 15494 1726853343.34545: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853343.34557: variable 'omit' from source: magic vars 15494 1726853343.34599: variable 'omit' from source: magic vars 15494 1726853343.34642: variable 'omit' from source: magic vars 15494 1726853343.34686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853343.34724: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853343.34752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853343.34774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853343.34790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853343.34820: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853343.34828: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.34835: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15494 1726853343.34934: Set connection var ansible_connection to ssh 15494 1726853343.34945: Set connection var ansible_pipelining to False 15494 1726853343.34959: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853343.34965: Set connection var ansible_shell_type to sh 15494 1726853343.34975: Set connection var ansible_timeout to 10 15494 1726853343.35175: Set connection var ansible_shell_executable to /bin/sh 15494 1726853343.35178: variable 'ansible_shell_executable' from source: unknown 15494 1726853343.35180: variable 'ansible_connection' from source: unknown 15494 1726853343.35183: variable 'ansible_module_compression' from source: unknown 15494 1726853343.35185: variable 'ansible_shell_type' from source: unknown 15494 1726853343.35187: variable 'ansible_shell_executable' from source: unknown 15494 1726853343.35189: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.35191: variable 'ansible_pipelining' from source: unknown 15494 1726853343.35193: variable 'ansible_timeout' from source: unknown 15494 1726853343.35194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.35197: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853343.35199: variable 'omit' from source: magic vars 15494 1726853343.35201: starting attempt loop 15494 1726853343.35203: running the handler 15494 1726853343.35251: variable '__network_connections_result' from source: set_fact 15494 1726853343.35333: variable '__network_connections_result' from source: set_fact 15494 1726853343.35456: handler run complete 15494 1726853343.35489: attempt loop complete, returning result 15494 1726853343.35496: 
_execute() done 15494 1726853343.35502: dumping result to json 15494 1726853343.35509: done dumping result, returning 15494 1726853343.35520: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-0028-1a50-000000000027] 15494 1726853343.35528: sending task result for task 02083763-bbaf-0028-1a50-000000000027 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 64b0877f-187d-4c9d-a4e5-a37e4f2875dc (not-active)" ] } } 15494 1726853343.35726: no more pending results, returning what we have 15494 1726853343.35730: results queue empty 15494 1726853343.35730: checking for any_errors_fatal 15494 1726853343.35736: done checking for any_errors_fatal 15494 1726853343.35737: checking for max_fail_percentage 15494 1726853343.35739: done checking for max_fail_percentage 15494 1726853343.35739: checking to see if all hosts have failed and the running result is not ok 15494 1726853343.35740: done checking to see if all hosts have failed 15494 1726853343.35741: getting the remaining hosts for 
this loop 15494 1726853343.35742: done getting the remaining hosts for this loop 15494 1726853343.35746: getting the next task for host managed_node1 15494 1726853343.35753: done getting next task for host managed_node1 15494 1726853343.35756: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15494 1726853343.35759: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853343.35768: getting variables 15494 1726853343.35770: in VariableManager get_vars() 15494 1726853343.35806: Calling all_inventory to load vars for managed_node1 15494 1726853343.35808: Calling groups_inventory to load vars for managed_node1 15494 1726853343.35810: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853343.35820: Calling all_plugins_play to load vars for managed_node1 15494 1726853343.35824: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853343.35826: Calling groups_plugins_play to load vars for managed_node1 15494 1726853343.36585: done sending task result for task 02083763-bbaf-0028-1a50-000000000027 15494 1726853343.36589: WORKER PROCESS EXITING 15494 1726853343.37510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853343.38996: done with get_vars() 15494 1726853343.39022: done getting variables 15494 1726853343.39084: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug 
messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:29:03 -0400 (0:00:00.058) 0:00:12.006 ****** 15494 1726853343.39116: entering _queue_task() for managed_node1/debug 15494 1726853343.39435: worker is 1 (out of 1 available) 15494 1726853343.39448: exiting _queue_task() for managed_node1/debug 15494 1726853343.39460: done queuing things up, now waiting for results queue to drain 15494 1726853343.39461: waiting for pending results... 15494 1726853343.39724: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15494 1726853343.39837: in run() - task 02083763-bbaf-0028-1a50-000000000028 15494 1726853343.39858: variable 'ansible_search_path' from source: unknown 15494 1726853343.39865: variable 'ansible_search_path' from source: unknown 15494 1726853343.39911: calling self._execute() 15494 1726853343.40010: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.40020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.40034: variable 'omit' from source: magic vars 15494 1726853343.40404: variable 'ansible_distribution_major_version' from source: facts 15494 1726853343.40420: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853343.40548: variable 'network_state' from source: role '' defaults 15494 1726853343.40565: Evaluated conditional (network_state != {}): False 15494 1726853343.40574: when evaluation is False, skipping this task 15494 1726853343.40581: _execute() done 15494 1726853343.40587: dumping result to json 15494 1726853343.40594: done dumping result, returning 15494 1726853343.40605: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-0028-1a50-000000000028] 15494 
1726853343.40614: sending task result for task 02083763-bbaf-0028-1a50-000000000028 skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15494 1726853343.40750: no more pending results, returning what we have 15494 1726853343.40756: results queue empty 15494 1726853343.40757: checking for any_errors_fatal 15494 1726853343.40767: done checking for any_errors_fatal 15494 1726853343.40768: checking for max_fail_percentage 15494 1726853343.40769: done checking for max_fail_percentage 15494 1726853343.40772: checking to see if all hosts have failed and the running result is not ok 15494 1726853343.40773: done checking to see if all hosts have failed 15494 1726853343.40773: getting the remaining hosts for this loop 15494 1726853343.40775: done getting the remaining hosts for this loop 15494 1726853343.40779: getting the next task for host managed_node1 15494 1726853343.40786: done getting next task for host managed_node1 15494 1726853343.40790: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15494 1726853343.40794: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853343.40807: getting variables 15494 1726853343.40809: in VariableManager get_vars() 15494 1726853343.40845: Calling all_inventory to load vars for managed_node1 15494 1726853343.40848: Calling groups_inventory to load vars for managed_node1 15494 1726853343.40850: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853343.40862: Calling all_plugins_play to load vars for managed_node1 15494 1726853343.40865: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853343.40868: Calling groups_plugins_play to load vars for managed_node1 15494 1726853343.41163: done sending task result for task 02083763-bbaf-0028-1a50-000000000028 15494 1726853343.41166: WORKER PROCESS EXITING 15494 1726853343.42400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853343.43921: done with get_vars() 15494 1726853343.43945: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:29:03 -0400 (0:00:00.049) 0:00:12.056 ****** 15494 1726853343.44036: entering _queue_task() for managed_node1/ping 15494 1726853343.44038: Creating lock for ping 15494 1726853343.44346: worker is 1 (out of 1 available) 15494 1726853343.44359: exiting _queue_task() for managed_node1/ping 15494 1726853343.44573: done queuing things up, now waiting for results queue to drain 15494 1726853343.44575: waiting for pending results... 
15494 1726853343.44635: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15494 1726853343.44744: in run() - task 02083763-bbaf-0028-1a50-000000000029 15494 1726853343.44766: variable 'ansible_search_path' from source: unknown 15494 1726853343.44778: variable 'ansible_search_path' from source: unknown 15494 1726853343.44822: calling self._execute() 15494 1726853343.44919: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.44930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.44976: variable 'omit' from source: magic vars 15494 1726853343.45325: variable 'ansible_distribution_major_version' from source: facts 15494 1726853343.45346: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853343.45358: variable 'omit' from source: magic vars 15494 1726853343.45400: variable 'omit' from source: magic vars 15494 1726853343.45450: variable 'omit' from source: magic vars 15494 1726853343.45489: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853343.45558: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853343.45562: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853343.45578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853343.45596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853343.45628: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853343.45638: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.45666: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15494 1726853343.45752: Set connection var ansible_connection to ssh 15494 1726853343.45765: Set connection var ansible_pipelining to False 15494 1726853343.45884: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853343.45887: Set connection var ansible_shell_type to sh 15494 1726853343.45889: Set connection var ansible_timeout to 10 15494 1726853343.45891: Set connection var ansible_shell_executable to /bin/sh 15494 1726853343.45893: variable 'ansible_shell_executable' from source: unknown 15494 1726853343.45894: variable 'ansible_connection' from source: unknown 15494 1726853343.45897: variable 'ansible_module_compression' from source: unknown 15494 1726853343.45898: variable 'ansible_shell_type' from source: unknown 15494 1726853343.45900: variable 'ansible_shell_executable' from source: unknown 15494 1726853343.45902: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853343.45903: variable 'ansible_pipelining' from source: unknown 15494 1726853343.45905: variable 'ansible_timeout' from source: unknown 15494 1726853343.45907: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853343.46062: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853343.46082: variable 'omit' from source: magic vars 15494 1726853343.46092: starting attempt loop 15494 1726853343.46105: running the handler 15494 1726853343.46125: _low_level_execute_command(): starting 15494 1726853343.46137: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853343.46972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.46987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.47006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.47090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.49078: stdout chunk (state=3): >>>/root <<< 15494 1726853343.49081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853343.49083: stdout chunk (state=3): >>><<< 15494 1726853343.49086: stderr chunk (state=3): >>><<< 15494 1726853343.49090: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853343.49092: _low_level_execute_command(): starting 15494 1726853343.49098: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139 `" && echo ansible-tmp-1726853343.4907944-16110-19273560372139="` echo /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139 `" ) && sleep 0' 15494 1726853343.50628: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853343.50643: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853343.50687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853343.50699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15494 1726853343.50757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.50794: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.50804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.50818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.50889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.52781: stdout chunk (state=3): >>>ansible-tmp-1726853343.4907944-16110-19273560372139=/root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139 <<< 15494 1726853343.53076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853343.53079: stdout chunk (state=3): >>><<< 15494 1726853343.53082: stderr chunk (state=3): >>><<< 15494 1726853343.53084: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853343.4907944-16110-19273560372139=/root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853343.53086: variable 'ansible_module_compression' from source: unknown 15494 1726853343.53088: ANSIBALLZ: Using lock for ping 15494 1726853343.53090: ANSIBALLZ: Acquiring lock 15494 1726853343.53092: ANSIBALLZ: Lock acquired: 140002368277872 15494 1726853343.53094: ANSIBALLZ: Creating module 15494 1726853343.69313: ANSIBALLZ: Writing module into payload 15494 1726853343.69385: ANSIBALLZ: Writing module 15494 1726853343.69412: ANSIBALLZ: Renaming module 15494 1726853343.69422: ANSIBALLZ: Done creating module 15494 1726853343.69444: variable 'ansible_facts' from source: unknown 15494 1726853343.69524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py 15494 1726853343.69715: Sending initial data 15494 1726853343.69718: Sent initial data (152 bytes) 15494 1726853343.70296: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853343.70310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853343.70355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853343.70440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.70455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.70468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.70574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.72222: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853343.72256: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853343.72336: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py" <<< 15494 1726853343.72351: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpglptmour /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py <<< 15494 1726853343.72531: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpglptmour" to remote "/root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py" <<< 15494 1726853343.73714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853343.73800: stderr chunk (state=3): >>><<< 15494 1726853343.73814: stdout chunk (state=3): >>><<< 15494 1726853343.74044: done transferring module to remote 15494 1726853343.74053: _low_level_execute_command(): starting 15494 1726853343.74056: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/ /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py && sleep 0' 15494 1726853343.74886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.74952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.74969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.74992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.75050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.76945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853343.76949: stdout chunk (state=3): >>><<< 15494 1726853343.76951: stderr chunk (state=3): >>><<< 15494 1726853343.77079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853343.77083: _low_level_execute_command(): starting 15494 1726853343.77086: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/AnsiballZ_ping.py && sleep 0' 15494 1726853343.78008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853343.78011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.78014: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853343.78017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853343.78019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.78065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.78089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.78133: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.78190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.93354: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15494 1726853343.94670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853343.94676: stdout chunk (state=3): >>><<< 15494 1726853343.94685: stderr chunk (state=3): >>><<< 15494 1726853343.94699: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853343.94722: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853343.94731: _low_level_execute_command(): starting 15494 1726853343.94736: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853343.4907944-16110-19273560372139/ > /dev/null 2>&1 && sleep 0' 15494 1726853343.95965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853343.95968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853343.95972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853343.95975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853343.95977: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853343.95982: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853343.95983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.95985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853343.95987: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 
1726853343.95989: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853343.95990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853343.95992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853343.95993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853343.95995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853343.96001: stderr chunk (state=3): >>>debug2: match found <<< 15494 1726853343.96003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853343.96005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853343.96006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853343.96008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853343.96054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853343.97940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853343.97943: stdout chunk (state=3): >>><<< 15494 1726853343.97952: stderr chunk (state=3): >>><<< 15494 1726853343.98105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853343.98111: handler run complete 15494 1726853343.98126: attempt loop complete, returning result 15494 1726853343.98129: _execute() done 15494 1726853343.98132: dumping result to json 15494 1726853343.98134: done dumping result, returning 15494 1726853343.98145: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-0028-1a50-000000000029] 15494 1726853343.98152: sending task result for task 02083763-bbaf-0028-1a50-000000000029 ok: [managed_node1] => { "changed": false, "ping": "pong" } 15494 1726853343.98430: no more pending results, returning what we have 15494 1726853343.98434: results queue empty 15494 1726853343.98435: checking for any_errors_fatal 15494 1726853343.98442: done checking for any_errors_fatal 15494 1726853343.98443: checking for max_fail_percentage 15494 1726853343.98445: done checking for max_fail_percentage 15494 1726853343.98446: checking to see if all hosts have failed and the running result is not ok 15494 1726853343.98447: done checking to see if all hosts have failed 15494 1726853343.98447: getting the remaining hosts for this loop 15494 1726853343.98449: done getting the remaining hosts for this loop 15494 1726853343.98453: getting the next task for host managed_node1 15494 1726853343.98462: 
done getting next task for host managed_node1 15494 1726853343.98464: ^ task is: TASK: meta (role_complete) 15494 1726853343.98467: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853343.98478: getting variables 15494 1726853343.98480: in VariableManager get_vars() 15494 1726853343.98517: Calling all_inventory to load vars for managed_node1 15494 1726853343.98520: Calling groups_inventory to load vars for managed_node1 15494 1726853343.98522: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853343.98533: Calling all_plugins_play to load vars for managed_node1 15494 1726853343.98536: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853343.98540: Calling groups_plugins_play to load vars for managed_node1 15494 1726853343.99185: done sending task result for task 02083763-bbaf-0028-1a50-000000000029 15494 1726853343.99188: WORKER PROCESS EXITING 15494 1726853344.01427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853344.04837: done with get_vars() 15494 1726853344.04869: done getting variables 15494 1726853344.05080: done queuing things up, now waiting for results queue to drain 15494 1726853344.05082: results queue empty 15494 1726853344.05083: checking for any_errors_fatal 15494 1726853344.05086: done checking for any_errors_fatal 15494 1726853344.05087: checking for max_fail_percentage 15494 1726853344.05089: done checking for max_fail_percentage 15494 1726853344.05090: checking to see if all hosts have failed and the running result is not ok 15494 1726853344.05091: done checking to see if all hosts have failed 15494 1726853344.05092: getting the remaining hosts for this loop 15494 
1726853344.05093: done getting the remaining hosts for this loop 15494 1726853344.05096: getting the next task for host managed_node1 15494 1726853344.05100: done getting next task for host managed_node1 15494 1726853344.05101: ^ task is: TASK: meta (flush_handlers) 15494 1726853344.05103: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853344.05106: getting variables 15494 1726853344.05107: in VariableManager get_vars() 15494 1726853344.05120: Calling all_inventory to load vars for managed_node1 15494 1726853344.05123: Calling groups_inventory to load vars for managed_node1 15494 1726853344.05125: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853344.05130: Calling all_plugins_play to load vars for managed_node1 15494 1726853344.05132: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853344.05135: Calling groups_plugins_play to load vars for managed_node1 15494 1726853344.07551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853344.10931: done with get_vars() 15494 1726853344.10953: done getting variables 15494 1726853344.11076: in VariableManager get_vars() 15494 1726853344.11092: Calling all_inventory to load vars for managed_node1 15494 1726853344.11094: Calling groups_inventory to load vars for managed_node1 15494 1726853344.11096: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853344.11101: Calling all_plugins_play to load vars for managed_node1 15494 1726853344.11103: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853344.11106: Calling groups_plugins_play to load vars for managed_node1 15494 1726853344.13455: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853344.16670: done with get_vars() 15494 1726853344.16823: done queuing things up, now waiting for results queue to drain 15494 1726853344.16826: results queue empty 15494 1726853344.16826: checking for any_errors_fatal 15494 1726853344.16828: done checking for any_errors_fatal 15494 1726853344.16829: checking for max_fail_percentage 15494 1726853344.16830: done checking for max_fail_percentage 15494 1726853344.16831: checking to see if all hosts have failed and the running result is not ok 15494 1726853344.16831: done checking to see if all hosts have failed 15494 1726853344.16832: getting the remaining hosts for this loop 15494 1726853344.16839: done getting the remaining hosts for this loop 15494 1726853344.16843: getting the next task for host managed_node1 15494 1726853344.16847: done getting next task for host managed_node1 15494 1726853344.16848: ^ task is: TASK: meta (flush_handlers) 15494 1726853344.16850: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853344.16853: getting variables 15494 1726853344.16854: in VariableManager get_vars() 15494 1726853344.16867: Calling all_inventory to load vars for managed_node1 15494 1726853344.16869: Calling groups_inventory to load vars for managed_node1 15494 1726853344.16976: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853344.16982: Calling all_plugins_play to load vars for managed_node1 15494 1726853344.16985: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853344.16987: Calling groups_plugins_play to load vars for managed_node1 15494 1726853344.19614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853344.22962: done with get_vars() 15494 1726853344.22990: done getting variables 15494 1726853344.23042: in VariableManager get_vars() 15494 1726853344.23063: Calling all_inventory to load vars for managed_node1 15494 1726853344.23066: Calling groups_inventory to load vars for managed_node1 15494 1726853344.23068: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853344.23075: Calling all_plugins_play to load vars for managed_node1 15494 1726853344.23077: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853344.23080: Calling groups_plugins_play to load vars for managed_node1 15494 1726853344.25419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853344.27173: done with get_vars() 15494 1726853344.27210: done queuing things up, now waiting for results queue to drain 15494 1726853344.27213: results queue empty 15494 1726853344.27214: checking for any_errors_fatal 15494 1726853344.27215: done checking for any_errors_fatal 15494 1726853344.27216: checking for max_fail_percentage 15494 1726853344.27217: done checking for max_fail_percentage 15494 1726853344.27218: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853344.27219: done checking to see if all hosts have failed 15494 1726853344.27219: getting the remaining hosts for this loop 15494 1726853344.27220: done getting the remaining hosts for this loop 15494 1726853344.27223: getting the next task for host managed_node1 15494 1726853344.27226: done getting next task for host managed_node1 15494 1726853344.27227: ^ task is: None 15494 1726853344.27229: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853344.27230: done queuing things up, now waiting for results queue to drain 15494 1726853344.27231: results queue empty 15494 1726853344.27231: checking for any_errors_fatal 15494 1726853344.27232: done checking for any_errors_fatal 15494 1726853344.27233: checking for max_fail_percentage 15494 1726853344.27234: done checking for max_fail_percentage 15494 1726853344.27235: checking to see if all hosts have failed and the running result is not ok 15494 1726853344.27235: done checking to see if all hosts have failed 15494 1726853344.27236: getting the next task for host managed_node1 15494 1726853344.27239: done getting next task for host managed_node1 15494 1726853344.27239: ^ task is: None 15494 1726853344.27241: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853344.27297: in VariableManager get_vars() 15494 1726853344.27314: done with get_vars() 15494 1726853344.27321: in VariableManager get_vars() 15494 1726853344.27331: done with get_vars() 15494 1726853344.27336: variable 'omit' from source: magic vars 15494 1726853344.27462: variable 'task' from source: play vars 15494 1726853344.27496: in VariableManager get_vars() 15494 1726853344.27512: done with get_vars() 15494 1726853344.27532: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15494 1726853344.27718: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853344.27745: getting the remaining hosts for this loop 15494 1726853344.27749: done getting the remaining hosts for this loop 15494 1726853344.27752: getting the next task for host managed_node1 15494 1726853344.27755: done getting next task for host managed_node1 15494 1726853344.27757: ^ task is: TASK: Gathering Facts 15494 1726853344.27758: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853344.27761: getting variables 15494 1726853344.27762: in VariableManager get_vars() 15494 1726853344.27770: Calling all_inventory to load vars for managed_node1 15494 1726853344.27775: Calling groups_inventory to load vars for managed_node1 15494 1726853344.27777: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853344.27783: Calling all_plugins_play to load vars for managed_node1 15494 1726853344.27785: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853344.27788: Calling groups_plugins_play to load vars for managed_node1 15494 1726853344.29560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853344.31762: done with get_vars() 15494 1726853344.31791: done getting variables 15494 1726853344.31837: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:29:04 -0400 (0:00:00.878) 0:00:12.934 ****** 15494 1726853344.31870: entering _queue_task() for managed_node1/gather_facts 15494 1726853344.32221: worker is 1 (out of 1 available) 15494 1726853344.32232: exiting _queue_task() for managed_node1/gather_facts 15494 1726853344.32243: done queuing things up, now waiting for results queue to drain 15494 1726853344.32244: waiting for pending results... 
15494 1726853344.32541: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853344.32723: in run() - task 02083763-bbaf-0028-1a50-000000000219 15494 1726853344.32727: variable 'ansible_search_path' from source: unknown 15494 1726853344.32730: calling self._execute() 15494 1726853344.32818: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853344.32837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853344.32853: variable 'omit' from source: magic vars 15494 1726853344.33263: variable 'ansible_distribution_major_version' from source: facts 15494 1726853344.33285: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853344.33296: variable 'omit' from source: magic vars 15494 1726853344.33375: variable 'omit' from source: magic vars 15494 1726853344.33378: variable 'omit' from source: magic vars 15494 1726853344.33420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853344.33478: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853344.33487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853344.33509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853344.33587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853344.33590: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853344.33593: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853344.33597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853344.33686: Set connection var ansible_connection to ssh 15494 1726853344.33704: Set 
connection var ansible_pipelining to False 15494 1726853344.33721: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853344.33805: Set connection var ansible_shell_type to sh 15494 1726853344.33808: Set connection var ansible_timeout to 10 15494 1726853344.33811: Set connection var ansible_shell_executable to /bin/sh 15494 1726853344.33813: variable 'ansible_shell_executable' from source: unknown 15494 1726853344.33815: variable 'ansible_connection' from source: unknown 15494 1726853344.33819: variable 'ansible_module_compression' from source: unknown 15494 1726853344.33822: variable 'ansible_shell_type' from source: unknown 15494 1726853344.33826: variable 'ansible_shell_executable' from source: unknown 15494 1726853344.33827: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853344.33829: variable 'ansible_pipelining' from source: unknown 15494 1726853344.33831: variable 'ansible_timeout' from source: unknown 15494 1726853344.33833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853344.34034: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853344.34129: variable 'omit' from source: magic vars 15494 1726853344.34133: starting attempt loop 15494 1726853344.34135: running the handler 15494 1726853344.34137: variable 'ansible_facts' from source: unknown 15494 1726853344.34139: _low_level_execute_command(): starting 15494 1726853344.34141: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853344.35385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853344.35549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853344.35650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853344.37335: stdout chunk (state=3): >>>/root <<< 15494 1726853344.37582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853344.37586: stdout chunk (state=3): >>><<< 15494 1726853344.37588: stderr chunk (state=3): >>><<< 15494 1726853344.37592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853344.37596: _low_level_execute_command(): starting 15494 1726853344.37598: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059 `" && echo ansible-tmp-1726853344.3751-16145-230073608097059="` echo /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059 `" ) && sleep 0' 15494 1726853344.38208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853344.38279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853344.38345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853344.38381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853344.38422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853344.38538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853344.40420: stdout chunk (state=3): >>>ansible-tmp-1726853344.3751-16145-230073608097059=/root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059 <<< 15494 1726853344.40590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853344.40594: stdout chunk (state=3): >>><<< 15494 1726853344.40597: stderr chunk (state=3): >>><<< 15494 1726853344.40622: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853344.3751-16145-230073608097059=/root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853344.40776: variable 'ansible_module_compression' from source: unknown 15494 1726853344.40780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853344.40819: variable 'ansible_facts' from source: unknown 15494 1726853344.41051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py 15494 1726853344.41190: Sending initial data 15494 1726853344.41253: Sent initial data (151 bytes) 15494 1726853344.41868: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853344.42006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853344.42022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a2da574bb2' <<< 15494 1726853344.42044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853344.42065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853344.42188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853344.43716: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853344.43778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853344.43828: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpn1nkxnhz /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py <<< 15494 1726853344.43832: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py" <<< 15494 1726853344.43874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpn1nkxnhz" to remote "/root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py" <<< 15494 1726853344.46065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853344.46102: stderr chunk (state=3): >>><<< 15494 1726853344.46116: stdout chunk (state=3): >>><<< 15494 1726853344.46221: done transferring module to remote 15494 1726853344.46226: _low_level_execute_command(): starting 15494 1726853344.46228: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/ /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py && sleep 0' 15494 1726853344.46887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853344.46929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853344.46949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853344.46978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853344.47037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853344.49120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853344.49125: stdout chunk (state=3): >>><<< 15494 1726853344.49127: stderr chunk (state=3): >>><<< 15494 1726853344.49135: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853344.49137: _low_level_execute_command(): starting 15494 1726853344.49140: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/AnsiballZ_setup.py && sleep 0' 15494 1726853344.49915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853344.49931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853344.49952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853344.49970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853344.49996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853344.50092: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853344.50118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853344.50134: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 15494 1726853344.50227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853344.50425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.14815: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.72314453125, "5m": 0.38037109375, "15m": 0.1630859375}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "e<<< 15494 1726853345.14837: stdout chunk (state=3): >>>nforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", 
"partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 511, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797081088, "block_size": 4096, "block_total": 65519099, "block_available": 63915303, "block_used": 1603796, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo", "LSR-TST-br31"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": 
"on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_has<<< 15494 1726853345.14861: stdout chunk (state=3): >>>hing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "<<< 15494 1726853345.14877: stdout chunk (state=3): >>>timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1e:33:24:48:88:76", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", 
"tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "05", "epoch": "1726853345", "epoch_int": 
"1726853345", "date": "2024-09-20", "time": "13:29:05", "iso8601_micro": "2024-09-20T17:29:05.144767Z", "iso8601": "2024-09-20T17:29:05Z", "iso8601_basic": "20240920T132905144767", "iso8601_basic_short": "20240920T132905", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853345.16851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853345.16854: stdout chunk (state=3): >>><<< 15494 1726853345.16856: stderr chunk (state=3): >>><<< 15494 1726853345.16939: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.72314453125, "5m": 0.38037109375, "15m": 0.1630859375}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_is_chroot": 
false, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 
512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 511, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797081088, "block_size": 4096, "block_total": 65519099, "block_available": 63915303, "block_used": 1603796, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo", "LSR-TST-br31"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": 
"lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": 
"off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1e:33:24:48:88:76", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "05", "epoch": "1726853345", "epoch_int": "1726853345", "date": "2024-09-20", "time": "13:29:05", "iso8601_micro": "2024-09-20T17:29:05.144767Z", "iso8601": "2024-09-20T17:29:05Z", "iso8601_basic": "20240920T132905144767", "iso8601_basic_short": "20240920T132905", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853345.17188: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853345.17228: _low_level_execute_command(): starting 15494 1726853345.17231: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853344.3751-16145-230073608097059/ > /dev/null 2>&1 && sleep 0' 15494 1726853345.17824: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853345.17828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853345.17830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853345.17832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853345.17834: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853345.17896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853345.17912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853345.17926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853345.18019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.19828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853345.19879: stderr chunk (state=3): >>><<< 15494 1726853345.19899: stdout chunk (state=3): >>><<< 15494 1726853345.19916: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853345.19925: handler run complete 15494 1726853345.20321: variable 'ansible_facts' from source: unknown 15494 1726853345.20357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.20743: variable 'ansible_facts' from source: unknown 15494 1726853345.20850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.21008: attempt loop complete, returning result 15494 1726853345.21019: _execute() done 15494 1726853345.21028: dumping result to json 15494 1726853345.21084: done dumping result, returning 15494 1726853345.21098: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-000000000219] 15494 1726853345.21175: sending task result for task 02083763-bbaf-0028-1a50-000000000219 ok: [managed_node1] 15494 1726853345.22055: no more pending results, returning what we have 15494 1726853345.22058: results queue empty 15494 1726853345.22059: checking for any_errors_fatal 15494 1726853345.22060: done checking for any_errors_fatal 15494 1726853345.22061: checking for max_fail_percentage 15494 1726853345.22062: done checking for max_fail_percentage 15494 1726853345.22063: checking to see if all hosts have failed and the running result is not ok 15494 1726853345.22064: done checking to see if all hosts have failed 15494 1726853345.22069: getting the remaining hosts for this loop 15494 1726853345.22072: done getting the remaining hosts for this loop 15494 1726853345.22075: getting the next task for host managed_node1 15494 1726853345.22080: done getting next task for host managed_node1 15494 1726853345.22082: ^ task is: TASK: meta (flush_handlers) 15494 1726853345.22084: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853345.22087: getting variables 15494 1726853345.22089: in VariableManager get_vars() 15494 1726853345.22114: Calling all_inventory to load vars for managed_node1 15494 1726853345.22116: Calling groups_inventory to load vars for managed_node1 15494 1726853345.22119: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.22125: done sending task result for task 02083763-bbaf-0028-1a50-000000000219 15494 1726853345.22129: WORKER PROCESS EXITING 15494 1726853345.22138: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.22141: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.22143: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.27580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.29105: done with get_vars() 15494 1726853345.29133: done getting variables 15494 1726853345.29195: in VariableManager get_vars() 15494 1726853345.29203: Calling all_inventory to load vars for managed_node1 15494 1726853345.29205: Calling groups_inventory to load vars for managed_node1 15494 1726853345.29207: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.29210: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.29212: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.29213: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.30006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.31084: done with get_vars() 15494 1726853345.31105: done queuing things up, now waiting for results queue to drain 15494 
1726853345.31107: results queue empty 15494 1726853345.31108: checking for any_errors_fatal 15494 1726853345.31110: done checking for any_errors_fatal 15494 1726853345.31111: checking for max_fail_percentage 15494 1726853345.31115: done checking for max_fail_percentage 15494 1726853345.31116: checking to see if all hosts have failed and the running result is not ok 15494 1726853345.31116: done checking to see if all hosts have failed 15494 1726853345.31117: getting the remaining hosts for this loop 15494 1726853345.31117: done getting the remaining hosts for this loop 15494 1726853345.31119: getting the next task for host managed_node1 15494 1726853345.31122: done getting next task for host managed_node1 15494 1726853345.31124: ^ task is: TASK: Include the task '{{ task }}' 15494 1726853345.31125: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853345.31126: getting variables 15494 1726853345.31127: in VariableManager get_vars() 15494 1726853345.31133: Calling all_inventory to load vars for managed_node1 15494 1726853345.31135: Calling groups_inventory to load vars for managed_node1 15494 1726853345.31136: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.31140: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.31142: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.31143: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.31872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.33313: done with get_vars() 15494 1726853345.33339: done getting variables 15494 1726853345.33528: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:29:05 -0400 (0:00:01.016) 0:00:13.951 ****** 15494 1726853345.33554: entering _queue_task() for managed_node1/include_tasks 15494 1726853345.34178: worker is 1 (out of 1 available) 15494 1726853345.34195: exiting _queue_task() for managed_node1/include_tasks 15494 1726853345.34207: done queuing things up, now waiting for results queue to drain 15494 1726853345.34208: waiting for pending results... 
15494 1726853345.34381: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' 15494 1726853345.34499: in run() - task 02083763-bbaf-0028-1a50-00000000002d 15494 1726853345.34677: variable 'ansible_search_path' from source: unknown 15494 1726853345.34681: calling self._execute() 15494 1726853345.34695: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.34715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.34719: variable 'omit' from source: magic vars 15494 1726853345.35103: variable 'ansible_distribution_major_version' from source: facts 15494 1726853345.35107: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853345.35110: variable 'task' from source: play vars 15494 1726853345.35159: variable 'task' from source: play vars 15494 1726853345.35191: _execute() done 15494 1726853345.35198: dumping result to json 15494 1726853345.35201: done dumping result, returning 15494 1726853345.35204: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_present.yml' [02083763-bbaf-0028-1a50-00000000002d] 15494 1726853345.35206: sending task result for task 02083763-bbaf-0028-1a50-00000000002d 15494 1726853345.35466: done sending task result for task 02083763-bbaf-0028-1a50-00000000002d 15494 1726853345.35474: WORKER PROCESS EXITING 15494 1726853345.35505: no more pending results, returning what we have 15494 1726853345.35511: in VariableManager get_vars() 15494 1726853345.35544: Calling all_inventory to load vars for managed_node1 15494 1726853345.35549: Calling groups_inventory to load vars for managed_node1 15494 1726853345.35552: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.35565: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.35568: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.35572: Calling 
groups_plugins_play to load vars for managed_node1 15494 1726853345.36858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.38707: done with get_vars() 15494 1726853345.38731: variable 'ansible_search_path' from source: unknown 15494 1726853345.38754: we have included files to process 15494 1726853345.38755: generating all_blocks data 15494 1726853345.38756: done generating all_blocks data 15494 1726853345.38757: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15494 1726853345.38758: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15494 1726853345.38761: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15494 1726853345.38908: in VariableManager get_vars() 15494 1726853345.38920: done with get_vars() 15494 1726853345.39008: done processing included file 15494 1726853345.39009: iterating over new_blocks loaded from include file 15494 1726853345.39010: in VariableManager get_vars() 15494 1726853345.39018: done with get_vars() 15494 1726853345.39019: filtering new block on tags 15494 1726853345.39035: done filtering new block on tags 15494 1726853345.39037: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node1 15494 1726853345.39041: extending task lists for all hosts with included blocks 15494 1726853345.39062: done extending task lists 15494 1726853345.39062: done processing included files 15494 1726853345.39063: results queue empty 15494 1726853345.39063: checking for any_errors_fatal 15494 1726853345.39064: done checking for any_errors_fatal 15494 
1726853345.39065: checking for max_fail_percentage 15494 1726853345.39065: done checking for max_fail_percentage 15494 1726853345.39066: checking to see if all hosts have failed and the running result is not ok 15494 1726853345.39066: done checking to see if all hosts have failed 15494 1726853345.39067: getting the remaining hosts for this loop 15494 1726853345.39068: done getting the remaining hosts for this loop 15494 1726853345.39069: getting the next task for host managed_node1 15494 1726853345.39073: done getting next task for host managed_node1 15494 1726853345.39075: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15494 1726853345.39076: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853345.39077: getting variables 15494 1726853345.39078: in VariableManager get_vars() 15494 1726853345.39084: Calling all_inventory to load vars for managed_node1 15494 1726853345.39085: Calling groups_inventory to load vars for managed_node1 15494 1726853345.39087: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.39091: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.39093: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.39095: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.40011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.41391: done with get_vars() 15494 1726853345.41410: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:29:05 -0400 (0:00:00.079) 0:00:14.030 ****** 15494 1726853345.41501: entering _queue_task() for managed_node1/include_tasks 15494 1726853345.41849: worker is 1 (out of 1 available) 15494 1726853345.41861: exiting _queue_task() for managed_node1/include_tasks 15494 1726853345.42076: done queuing things up, now waiting for results queue to drain 15494 1726853345.42078: waiting for pending results... 
15494 1726853345.42161: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15494 1726853345.42412: in run() - task 02083763-bbaf-0028-1a50-00000000022a 15494 1726853345.42416: variable 'ansible_search_path' from source: unknown 15494 1726853345.42421: variable 'ansible_search_path' from source: unknown 15494 1726853345.42424: calling self._execute() 15494 1726853345.42485: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.42500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.42524: variable 'omit' from source: magic vars 15494 1726853345.42968: variable 'ansible_distribution_major_version' from source: facts 15494 1726853345.42989: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853345.43005: _execute() done 15494 1726853345.43018: dumping result to json 15494 1726853345.43032: done dumping result, returning 15494 1726853345.43050: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-0028-1a50-00000000022a] 15494 1726853345.43061: sending task result for task 02083763-bbaf-0028-1a50-00000000022a 15494 1726853345.43394: done sending task result for task 02083763-bbaf-0028-1a50-00000000022a 15494 1726853345.43398: WORKER PROCESS EXITING 15494 1726853345.43431: no more pending results, returning what we have 15494 1726853345.43439: in VariableManager get_vars() 15494 1726853345.43474: Calling all_inventory to load vars for managed_node1 15494 1726853345.43481: Calling groups_inventory to load vars for managed_node1 15494 1726853345.43485: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.43500: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.43505: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.43509: Calling groups_plugins_play to load vars for managed_node1 15494 
1726853345.44987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.46653: done with get_vars() 15494 1726853345.46672: variable 'ansible_search_path' from source: unknown 15494 1726853345.46674: variable 'ansible_search_path' from source: unknown 15494 1726853345.46684: variable 'task' from source: play vars 15494 1726853345.46789: variable 'task' from source: play vars 15494 1726853345.46827: we have included files to process 15494 1726853345.46828: generating all_blocks data 15494 1726853345.46830: done generating all_blocks data 15494 1726853345.46831: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853345.46832: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853345.46835: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853345.47021: done processing included file 15494 1726853345.47024: iterating over new_blocks loaded from include file 15494 1726853345.47026: in VariableManager get_vars() 15494 1726853345.47039: done with get_vars() 15494 1726853345.47040: filtering new block on tags 15494 1726853345.47056: done filtering new block on tags 15494 1726853345.47058: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15494 1726853345.47063: extending task lists for all hosts with included blocks 15494 1726853345.47165: done extending task lists 15494 1726853345.47167: done processing included files 15494 1726853345.47168: results queue empty 15494 1726853345.47168: checking for any_errors_fatal 15494 1726853345.47173: done checking 
for any_errors_fatal 15494 1726853345.47174: checking for max_fail_percentage 15494 1726853345.47175: done checking for max_fail_percentage 15494 1726853345.47176: checking to see if all hosts have failed and the running result is not ok 15494 1726853345.47177: done checking to see if all hosts have failed 15494 1726853345.47177: getting the remaining hosts for this loop 15494 1726853345.47179: done getting the remaining hosts for this loop 15494 1726853345.47181: getting the next task for host managed_node1 15494 1726853345.47186: done getting next task for host managed_node1 15494 1726853345.47188: ^ task is: TASK: Get stat for interface {{ interface }} 15494 1726853345.47190: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853345.47192: getting variables 15494 1726853345.47193: in VariableManager get_vars() 15494 1726853345.47201: Calling all_inventory to load vars for managed_node1 15494 1726853345.47204: Calling groups_inventory to load vars for managed_node1 15494 1726853345.47206: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.47211: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.47214: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.47217: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.48308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.49813: done with get_vars() 15494 1726853345.49834: done getting variables 15494 1726853345.49956: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:29:05 -0400 (0:00:00.084) 0:00:14.115 ****** 15494 1726853345.49988: entering _queue_task() for managed_node1/stat 15494 1726853345.50325: worker is 1 (out of 1 available) 15494 1726853345.50336: exiting _queue_task() for managed_node1/stat 15494 1726853345.50349: done queuing things up, now waiting for results queue to drain 15494 1726853345.50350: waiting for pending results... 
15494 1726853345.50630: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15494 1726853345.50754: in run() - task 02083763-bbaf-0028-1a50-000000000235 15494 1726853345.50779: variable 'ansible_search_path' from source: unknown 15494 1726853345.50787: variable 'ansible_search_path' from source: unknown 15494 1726853345.50831: calling self._execute() 15494 1726853345.50930: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.50944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.50962: variable 'omit' from source: magic vars 15494 1726853345.51342: variable 'ansible_distribution_major_version' from source: facts 15494 1726853345.51361: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853345.51376: variable 'omit' from source: magic vars 15494 1726853345.51427: variable 'omit' from source: magic vars 15494 1726853345.51578: variable 'interface' from source: set_fact 15494 1726853345.51582: variable 'omit' from source: magic vars 15494 1726853345.51601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853345.51640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853345.51668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853345.51699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853345.51778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853345.51781: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853345.51784: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.51787: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.51880: Set connection var ansible_connection to ssh 15494 1726853345.51891: Set connection var ansible_pipelining to False 15494 1726853345.51902: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853345.51915: Set connection var ansible_shell_type to sh 15494 1726853345.51925: Set connection var ansible_timeout to 10 15494 1726853345.51936: Set connection var ansible_shell_executable to /bin/sh 15494 1726853345.51966: variable 'ansible_shell_executable' from source: unknown 15494 1726853345.51980: variable 'ansible_connection' from source: unknown 15494 1726853345.51992: variable 'ansible_module_compression' from source: unknown 15494 1726853345.52000: variable 'ansible_shell_type' from source: unknown 15494 1726853345.52021: variable 'ansible_shell_executable' from source: unknown 15494 1726853345.52026: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.52030: variable 'ansible_pipelining' from source: unknown 15494 1726853345.52033: variable 'ansible_timeout' from source: unknown 15494 1726853345.52176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.52257: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853345.52278: variable 'omit' from source: magic vars 15494 1726853345.52296: starting attempt loop 15494 1726853345.52304: running the handler 15494 1726853345.52323: _low_level_execute_command(): starting 15494 1726853345.52336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853345.53033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853345.53132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853345.53177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.54852: stdout chunk (state=3): >>>/root <<< 15494 1726853345.55018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853345.55022: stdout chunk (state=3): >>><<< 15494 1726853345.55033: stderr chunk (state=3): >>><<< 15494 1726853345.55082: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853345.55181: _low_level_execute_command(): starting 15494 1726853345.55185: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317 `" && echo ansible-tmp-1726853345.5509312-16188-52979951479317="` echo /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317 `" ) && sleep 0' 15494 1726853345.55692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853345.55706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853345.55717: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853345.55774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853345.55778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853345.55822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.57706: stdout chunk (state=3): >>>ansible-tmp-1726853345.5509312-16188-52979951479317=/root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317 <<< 15494 1726853345.57833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853345.57844: stderr chunk (state=3): >>><<< 15494 1726853345.57853: stdout chunk (state=3): >>><<< 15494 1726853345.57879: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853345.5509312-16188-52979951479317=/root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853345.58076: variable 'ansible_module_compression' from source: unknown 15494 1726853345.58079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15494 1726853345.58081: variable 'ansible_facts' from source: unknown 15494 1726853345.58113: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py 15494 1726853345.58330: Sending initial data 15494 1726853345.58341: Sent initial data (152 bytes) 15494 1726853345.58844: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853345.58859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853345.58878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853345.58979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853345.59005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853345.59083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.60605: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853345.60628: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853345.60687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853345.60730: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpx48qboyk /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py <<< 15494 1726853345.60759: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py" <<< 15494 1726853345.60811: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpx48qboyk" to remote "/root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py" <<< 15494 1726853345.62725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853345.62882: stderr chunk (state=3): >>><<< 15494 1726853345.62885: stdout chunk (state=3): >>><<< 15494 1726853345.62887: done transferring module to remote 15494 1726853345.62889: _low_level_execute_command(): starting 15494 1726853345.62963: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/ /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py && sleep 0' 15494 1726853345.63558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853345.63632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853345.63702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853345.63721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853345.63749: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853345.63841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.65763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853345.65773: stdout chunk (state=3): >>><<< 15494 1726853345.65781: stderr chunk (state=3): >>><<< 15494 1726853345.65784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853345.65787: _low_level_execute_command(): starting 15494 1726853345.65789: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/AnsiballZ_stat.py && sleep 0' 15494 1726853345.66459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853345.66687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853345.66691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853345.66718: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853345.66741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853345.67190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 
1726853345.67263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.82521: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27671, "dev": 23, "nlink": 1, "atime": 1726853343.0270474, "mtime": 1726853343.0270474, "ctime": 1726853343.0270474, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15494 1726853345.83964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853345.83978: stdout chunk (state=3): >>><<< 15494 1726853345.83987: stderr chunk (state=3): >>><<< 15494 1726853345.84217: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27671, "dev": 23, "nlink": 1, "atime": 1726853343.0270474, "mtime": 1726853343.0270474, "ctime": 1726853343.0270474, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853345.84221: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853345.84224: _low_level_execute_command(): starting 15494 1726853345.84227: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853345.5509312-16188-52979951479317/ > /dev/null 2>&1 && sleep 0' 15494 1726853345.85728: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853345.85787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853345.85928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853345.86043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853345.86157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853345.88050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853345.88054: stdout chunk (state=3): >>><<< 15494 1726853345.88060: stderr chunk (state=3): >>><<< 15494 1726853345.88084: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853345.88091: handler run complete 15494 1726853345.88141: attempt loop complete, returning result 15494 1726853345.88144: _execute() done 15494 1726853345.88149: dumping result to json 15494 1726853345.88151: done dumping result, returning 15494 1726853345.88176: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000235] 15494 1726853345.88180: sending task result for task 02083763-bbaf-0028-1a50-000000000235 15494 1726853345.88289: done sending task result for task 02083763-bbaf-0028-1a50-000000000235 15494 1726853345.88292: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726853343.0270474, "block_size": 4096, "blocks": 0, "ctime": 1726853343.0270474, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27671, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1726853343.0270474, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15494 1726853345.88473: no more pending results, returning what we have 15494 1726853345.88478: results queue empty 15494 1726853345.88479: checking for any_errors_fatal 15494 1726853345.88480: done checking 
for any_errors_fatal 15494 1726853345.88481: checking for max_fail_percentage 15494 1726853345.88575: done checking for max_fail_percentage 15494 1726853345.88576: checking to see if all hosts have failed and the running result is not ok 15494 1726853345.88577: done checking to see if all hosts have failed 15494 1726853345.88577: getting the remaining hosts for this loop 15494 1726853345.88578: done getting the remaining hosts for this loop 15494 1726853345.88582: getting the next task for host managed_node1 15494 1726853345.88589: done getting next task for host managed_node1 15494 1726853345.88592: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15494 1726853345.88595: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853345.88598: getting variables 15494 1726853345.88599: in VariableManager get_vars() 15494 1726853345.88624: Calling all_inventory to load vars for managed_node1 15494 1726853345.88627: Calling groups_inventory to load vars for managed_node1 15494 1726853345.88629: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.88638: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.88640: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.88643: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.90512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853345.92815: done with get_vars() 15494 1726853345.92847: done getting variables 15494 1726853345.92936: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853345.93080: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:29:05 -0400 (0:00:00.431) 0:00:14.546 ****** 15494 1726853345.93111: entering _queue_task() for managed_node1/assert 15494 1726853345.93583: worker is 1 (out of 1 available) 15494 1726853345.93594: exiting _queue_task() for managed_node1/assert 15494 1726853345.93604: done queuing things up, now waiting for results queue to drain 15494 1726853345.93606: waiting for pending results... 
15494 1726853345.93843: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' 15494 1726853345.93922: in run() - task 02083763-bbaf-0028-1a50-00000000022b 15494 1726853345.94005: variable 'ansible_search_path' from source: unknown 15494 1726853345.94008: variable 'ansible_search_path' from source: unknown 15494 1726853345.94012: calling self._execute() 15494 1726853345.94082: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.94094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.94109: variable 'omit' from source: magic vars 15494 1726853345.94677: variable 'ansible_distribution_major_version' from source: facts 15494 1726853345.94682: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853345.94685: variable 'omit' from source: magic vars 15494 1726853345.94812: variable 'omit' from source: magic vars 15494 1726853345.94913: variable 'interface' from source: set_fact 15494 1726853345.94947: variable 'omit' from source: magic vars 15494 1726853345.95143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853345.95147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853345.95149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853345.95267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853345.95286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853345.95322: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853345.95368: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.95381: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.95602: Set connection var ansible_connection to ssh 15494 1726853345.95615: Set connection var ansible_pipelining to False 15494 1726853345.95777: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853345.95781: Set connection var ansible_shell_type to sh 15494 1726853345.95783: Set connection var ansible_timeout to 10 15494 1726853345.95785: Set connection var ansible_shell_executable to /bin/sh 15494 1726853345.95787: variable 'ansible_shell_executable' from source: unknown 15494 1726853345.95791: variable 'ansible_connection' from source: unknown 15494 1726853345.95794: variable 'ansible_module_compression' from source: unknown 15494 1726853345.95796: variable 'ansible_shell_type' from source: unknown 15494 1726853345.95798: variable 'ansible_shell_executable' from source: unknown 15494 1726853345.95800: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853345.95802: variable 'ansible_pipelining' from source: unknown 15494 1726853345.95805: variable 'ansible_timeout' from source: unknown 15494 1726853345.95806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853345.96194: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853345.96198: variable 'omit' from source: magic vars 15494 1726853345.96200: starting attempt loop 15494 1726853345.96203: running the handler 15494 1726853345.96576: variable 'interface_stat' from source: set_fact 15494 1726853345.96580: Evaluated conditional (interface_stat.stat.exists): True 15494 1726853345.96582: handler run complete 15494 1726853345.96599: attempt loop complete, returning result 15494 
1726853345.96676: _execute() done 15494 1726853345.96679: dumping result to json 15494 1726853345.96681: done dumping result, returning 15494 1726853345.96684: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'LSR-TST-br31' [02083763-bbaf-0028-1a50-00000000022b] 15494 1726853345.96686: sending task result for task 02083763-bbaf-0028-1a50-00000000022b ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853345.97019: no more pending results, returning what we have 15494 1726853345.97025: results queue empty 15494 1726853345.97026: checking for any_errors_fatal 15494 1726853345.97038: done checking for any_errors_fatal 15494 1726853345.97039: checking for max_fail_percentage 15494 1726853345.97041: done checking for max_fail_percentage 15494 1726853345.97042: checking to see if all hosts have failed and the running result is not ok 15494 1726853345.97046: done checking to see if all hosts have failed 15494 1726853345.97047: getting the remaining hosts for this loop 15494 1726853345.97049: done getting the remaining hosts for this loop 15494 1726853345.97053: getting the next task for host managed_node1 15494 1726853345.97064: done getting next task for host managed_node1 15494 1726853345.97066: ^ task is: TASK: meta (flush_handlers) 15494 1726853345.97068: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853345.97277: getting variables 15494 1726853345.97281: in VariableManager get_vars() 15494 1726853345.97317: Calling all_inventory to load vars for managed_node1 15494 1726853345.97320: Calling groups_inventory to load vars for managed_node1 15494 1726853345.97324: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853345.97356: Calling all_plugins_play to load vars for managed_node1 15494 1726853345.97360: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853345.97363: Calling groups_plugins_play to load vars for managed_node1 15494 1726853345.97950: done sending task result for task 02083763-bbaf-0028-1a50-00000000022b 15494 1726853345.97954: WORKER PROCESS EXITING 15494 1726853346.00232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.02701: done with get_vars() 15494 1726853346.02726: done getting variables 15494 1726853346.02808: in VariableManager get_vars() 15494 1726853346.02818: Calling all_inventory to load vars for managed_node1 15494 1726853346.02821: Calling groups_inventory to load vars for managed_node1 15494 1726853346.02823: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853346.02828: Calling all_plugins_play to load vars for managed_node1 15494 1726853346.02831: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853346.02834: Calling groups_plugins_play to load vars for managed_node1 15494 1726853346.04129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.05856: done with get_vars() 15494 1726853346.05892: done queuing things up, now waiting for results queue to drain 15494 1726853346.05894: results queue empty 15494 1726853346.05895: checking for any_errors_fatal 15494 1726853346.05898: done checking for any_errors_fatal 15494 1726853346.05899: checking for max_fail_percentage 15494 
1726853346.05900: done checking for max_fail_percentage 15494 1726853346.05901: checking to see if all hosts have failed and the running result is not ok 15494 1726853346.05902: done checking to see if all hosts have failed 15494 1726853346.05908: getting the remaining hosts for this loop 15494 1726853346.05909: done getting the remaining hosts for this loop 15494 1726853346.05912: getting the next task for host managed_node1 15494 1726853346.05916: done getting next task for host managed_node1 15494 1726853346.05917: ^ task is: TASK: meta (flush_handlers) 15494 1726853346.05919: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853346.05922: getting variables 15494 1726853346.05923: in VariableManager get_vars() 15494 1726853346.05932: Calling all_inventory to load vars for managed_node1 15494 1726853346.05934: Calling groups_inventory to load vars for managed_node1 15494 1726853346.05936: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853346.05941: Calling all_plugins_play to load vars for managed_node1 15494 1726853346.05944: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853346.05947: Calling groups_plugins_play to load vars for managed_node1 15494 1726853346.07104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.08101: done with get_vars() 15494 1726853346.08115: done getting variables 15494 1726853346.08151: in VariableManager get_vars() 15494 1726853346.08158: Calling all_inventory to load vars for managed_node1 15494 1726853346.08161: Calling groups_inventory to load vars for managed_node1 15494 1726853346.08163: Calling all_plugins_inventory to load vars for managed_node1 15494 
1726853346.08166: Calling all_plugins_play to load vars for managed_node1 15494 1726853346.08167: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853346.08169: Calling groups_plugins_play to load vars for managed_node1 15494 1726853346.08785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.10042: done with get_vars() 15494 1726853346.10068: done queuing things up, now waiting for results queue to drain 15494 1726853346.10072: results queue empty 15494 1726853346.10073: checking for any_errors_fatal 15494 1726853346.10074: done checking for any_errors_fatal 15494 1726853346.10075: checking for max_fail_percentage 15494 1726853346.10076: done checking for max_fail_percentage 15494 1726853346.10077: checking to see if all hosts have failed and the running result is not ok 15494 1726853346.10077: done checking to see if all hosts have failed 15494 1726853346.10078: getting the remaining hosts for this loop 15494 1726853346.10079: done getting the remaining hosts for this loop 15494 1726853346.10081: getting the next task for host managed_node1 15494 1726853346.10085: done getting next task for host managed_node1 15494 1726853346.10086: ^ task is: None 15494 1726853346.10087: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853346.10088: done queuing things up, now waiting for results queue to drain 15494 1726853346.10089: results queue empty 15494 1726853346.10090: checking for any_errors_fatal 15494 1726853346.10091: done checking for any_errors_fatal 15494 1726853346.10091: checking for max_fail_percentage 15494 1726853346.10092: done checking for max_fail_percentage 15494 1726853346.10093: checking to see if all hosts have failed and the running result is not ok 15494 1726853346.10093: done checking to see if all hosts have failed 15494 1726853346.10095: getting the next task for host managed_node1 15494 1726853346.10097: done getting next task for host managed_node1 15494 1726853346.10097: ^ task is: None 15494 1726853346.10098: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853346.10142: in VariableManager get_vars() 15494 1726853346.10159: done with get_vars() 15494 1726853346.10165: in VariableManager get_vars() 15494 1726853346.10178: done with get_vars() 15494 1726853346.10185: variable 'omit' from source: magic vars 15494 1726853346.10283: variable 'task' from source: play vars 15494 1726853346.10304: in VariableManager get_vars() 15494 1726853346.10311: done with get_vars() 15494 1726853346.10324: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15494 1726853346.10484: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853346.10503: getting the remaining hosts for this loop 15494 1726853346.10504: done getting the remaining hosts for this loop 15494 1726853346.10506: getting the next task for host managed_node1 15494 1726853346.10508: done getting next task for host managed_node1 15494 1726853346.10509: ^ task is: TASK: Gathering Facts 15494 1726853346.10510: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853346.10511: getting variables 15494 1726853346.10512: in VariableManager get_vars() 15494 1726853346.10517: Calling all_inventory to load vars for managed_node1 15494 1726853346.10519: Calling groups_inventory to load vars for managed_node1 15494 1726853346.10520: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853346.10524: Calling all_plugins_play to load vars for managed_node1 15494 1726853346.10525: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853346.10527: Calling groups_plugins_play to load vars for managed_node1 15494 1726853346.11247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.12116: done with get_vars() 15494 1726853346.12132: done getting variables 15494 1726853346.12166: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:29:06 -0400 (0:00:00.190) 0:00:14.737 ****** 15494 1726853346.12187: entering _queue_task() for managed_node1/gather_facts 15494 1726853346.12479: worker is 1 (out of 1 available) 15494 1726853346.12489: exiting _queue_task() for managed_node1/gather_facts 15494 1726853346.12501: done queuing things up, now waiting for results queue to drain 15494 1726853346.12504: waiting for pending results... 
15494 1726853346.12889: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853346.12894: in run() - task 02083763-bbaf-0028-1a50-00000000024e 15494 1726853346.12916: variable 'ansible_search_path' from source: unknown 15494 1726853346.12958: calling self._execute() 15494 1726853346.13061: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853346.13074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853346.13090: variable 'omit' from source: magic vars 15494 1726853346.13488: variable 'ansible_distribution_major_version' from source: facts 15494 1726853346.13505: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853346.13516: variable 'omit' from source: magic vars 15494 1726853346.13558: variable 'omit' from source: magic vars 15494 1726853346.13642: variable 'omit' from source: magic vars 15494 1726853346.13654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853346.13697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853346.13725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853346.13755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853346.13775: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853346.13809: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853346.13861: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853346.13864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853346.13937: Set connection var ansible_connection to ssh 15494 1726853346.13951: Set 
connection var ansible_pipelining to False 15494 1726853346.13965: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853346.13977: Set connection var ansible_shell_type to sh 15494 1726853346.13988: Set connection var ansible_timeout to 10 15494 1726853346.14078: Set connection var ansible_shell_executable to /bin/sh 15494 1726853346.14081: variable 'ansible_shell_executable' from source: unknown 15494 1726853346.14083: variable 'ansible_connection' from source: unknown 15494 1726853346.14085: variable 'ansible_module_compression' from source: unknown 15494 1726853346.14087: variable 'ansible_shell_type' from source: unknown 15494 1726853346.14089: variable 'ansible_shell_executable' from source: unknown 15494 1726853346.14091: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853346.14093: variable 'ansible_pipelining' from source: unknown 15494 1726853346.14095: variable 'ansible_timeout' from source: unknown 15494 1726853346.14098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853346.14258: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853346.14275: variable 'omit' from source: magic vars 15494 1726853346.14285: starting attempt loop 15494 1726853346.14296: running the handler 15494 1726853346.14317: variable 'ansible_facts' from source: unknown 15494 1726853346.14340: _low_level_execute_command(): starting 15494 1726853346.14356: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853346.15139: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853346.15175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853346.15277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853346.15295: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853346.15310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853346.15331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853346.15410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853346.17102: stdout chunk (state=3): >>>/root <<< 15494 1726853346.17252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853346.17265: stdout chunk (state=3): >>><<< 15494 1726853346.17288: stderr chunk (state=3): >>><<< 15494 1726853346.17316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853346.17337: _low_level_execute_command(): starting 15494 1726853346.17433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489 `" && echo ansible-tmp-1726853346.1732385-16218-260562902523489="` echo /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489 `" ) && sleep 0' 15494 1726853346.18026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853346.18038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853346.18086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853346.18120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853346.18210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853346.18237: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853346.18318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853346.20198: stdout chunk (state=3): >>>ansible-tmp-1726853346.1732385-16218-260562902523489=/root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489 <<< 15494 1726853346.20364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853346.20368: stdout chunk (state=3): >>><<< 15494 1726853346.20370: stderr chunk (state=3): >>><<< 15494 1726853346.20392: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853346.1732385-16218-260562902523489=/root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853346.20430: variable 'ansible_module_compression' from source: unknown 15494 1726853346.20580: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853346.20583: variable 'ansible_facts' from source: unknown 15494 1726853346.20803: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py 15494 1726853346.20998: Sending initial data 15494 1726853346.21001: Sent initial data (154 bytes) 15494 1726853346.21640: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853346.21658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853346.21766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853346.21795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853346.21869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853346.23463: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853346.23523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853346.23563: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp5tlqpnmd /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py <<< 15494 1726853346.23567: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py" <<< 15494 1726853346.23617: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp5tlqpnmd" to remote "/root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py" <<< 15494 1726853346.25183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853346.25186: stdout chunk (state=3): >>><<< 15494 1726853346.25189: stderr chunk (state=3): >>><<< 15494 1726853346.25191: done transferring module to remote 15494 1726853346.25204: _low_level_execute_command(): starting 15494 1726853346.25213: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/ /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py && sleep 0' 15494 1726853346.25909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853346.25922: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853346.25942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853346.26076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853346.26088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853346.26172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853346.28074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853346.28084: stdout chunk (state=3): >>><<< 15494 1726853346.28094: stderr chunk (state=3): >>><<< 15494 1726853346.28112: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853346.28129: _low_level_execute_command(): starting 15494 1726853346.28138: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/AnsiballZ_setup.py && sleep 0' 15494 1726853346.28775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853346.28788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853346.28803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853346.28838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853346.28858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853346.28951: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853346.28975: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853346.29060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853346.92849: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.72314453125, "5m": 0.38037109375, "15m": 0.1630859375}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], 
"nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "06", "epoch": "1726853346", "epoch_int": "1726853346", "date": "2024-09-20", "time": "13:29:06", "iso8601_micro": "2024-09-20T17:29:06.567743Z", "iso8601": "2024-09-20T17:29:06Z", "iso8601_basic": "20240920T132906567743", "iso8601_basic_short": "20240920T132906", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo", "LSR-TST-br31"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", 
"scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", 
"rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"add<<< 15494 1726853346.92890: stdout chunk (state=3): >>>ress": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1e:33:24:48:88:76", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", 
"host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 512, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797052416, "block_size": 4096, "block_total": 65519099, "block_available": 63915296, "block_used": 1603803, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853346.95278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853346.95282: stdout chunk (state=3): >>><<< 15494 1726853346.95284: stderr chunk (state=3): >>><<< 15494 1726853346.95289: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.72314453125, "5m": 0.38037109375, "15m": 0.1630859375}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, 
"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "06", "epoch": "1726853346", "epoch_int": "1726853346", "date": "2024-09-20", "time": "13:29:06", "iso8601_micro": "2024-09-20T17:29:06.567743Z", "iso8601": "2024-09-20T17:29:06Z", "iso8601_basic": "20240920T132906567743", "iso8601_basic_short": "20240920T132906", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo", "LSR-TST-br31"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", 
"tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": 
"off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": 
"off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1e:33:24:48:88:76", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", 
"tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_pkg_mgr": "dnf", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 512, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797052416, "block_size": 4096, "block_total": 65519099, "block_available": 63915296, "block_used": 1603803, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853346.95534: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853346.95566: _low_level_execute_command(): starting 15494 1726853346.95580: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853346.1732385-16218-260562902523489/ > /dev/null 2>&1 && sleep 0' 15494 1726853346.96286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853346.96299: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853346.96311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853346.96375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853346.98267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853346.98272: stdout chunk (state=3): >>><<< 15494 1726853346.98275: stderr chunk (state=3): >>><<< 15494 1726853346.98478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853346.98481: handler run complete 15494 1726853346.98484: variable 'ansible_facts' from source: unknown 15494 1726853346.98601: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.98984: variable 'ansible_facts' from source: unknown 15494 1726853346.99273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853346.99418: attempt loop complete, returning result 15494 1726853346.99428: _execute() done 15494 1726853346.99435: dumping result to json 15494 1726853346.99483: done dumping result, returning 15494 1726853346.99495: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-00000000024e] 15494 1726853346.99504: sending task result for task 02083763-bbaf-0028-1a50-00000000024e ok: [managed_node1] 15494 1726853347.00575: no more pending results, returning what we have 15494 1726853347.00578: results queue empty 15494 1726853347.00579: checking for any_errors_fatal 15494 1726853347.00580: done checking for any_errors_fatal 15494 1726853347.00581: checking for max_fail_percentage 15494 1726853347.00582: done checking for max_fail_percentage 15494 1726853347.00583: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.00584: done checking to see if all hosts have failed 15494 1726853347.00585: getting the remaining hosts for this loop 15494 1726853347.00586: done getting the remaining hosts for this loop 15494 1726853347.00590: getting the next task for host managed_node1 15494 1726853347.00594: done getting next task for host managed_node1 15494 1726853347.00596: ^ task is: TASK: meta (flush_handlers) 15494 1726853347.00598: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853347.00602: getting variables 15494 1726853347.00603: in VariableManager get_vars() 15494 1726853347.00630: Calling all_inventory to load vars for managed_node1 15494 1726853347.00632: Calling groups_inventory to load vars for managed_node1 15494 1726853347.00636: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.00642: done sending task result for task 02083763-bbaf-0028-1a50-00000000024e 15494 1726853347.00645: WORKER PROCESS EXITING 15494 1726853347.00657: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.00660: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.00663: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.01991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.03821: done with get_vars() 15494 1726853347.03844: done getting variables 15494 1726853347.03926: in VariableManager get_vars() 15494 1726853347.03936: Calling all_inventory to load vars for managed_node1 15494 1726853347.03938: Calling groups_inventory to load vars for managed_node1 15494 1726853347.03941: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.03945: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.03951: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.03954: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.05807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.09301: done with get_vars() 15494 1726853347.09338: done queuing things up, now waiting for results queue to drain 15494 1726853347.09341: results queue empty 15494 1726853347.09342: checking for any_errors_fatal 15494 1726853347.09349: done checking for any_errors_fatal 15494 1726853347.09350: checking for max_fail_percentage 15494 
1726853347.09352: done checking for max_fail_percentage 15494 1726853347.09353: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.09353: done checking to see if all hosts have failed 15494 1726853347.09359: getting the remaining hosts for this loop 15494 1726853347.09360: done getting the remaining hosts for this loop 15494 1726853347.09364: getting the next task for host managed_node1 15494 1726853347.09367: done getting next task for host managed_node1 15494 1726853347.09477: ^ task is: TASK: Include the task '{{ task }}' 15494 1726853347.09483: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853347.09486: getting variables 15494 1726853347.09487: in VariableManager get_vars() 15494 1726853347.09498: Calling all_inventory to load vars for managed_node1 15494 1726853347.09500: Calling groups_inventory to load vars for managed_node1 15494 1726853347.09502: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.09508: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.09510: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.09513: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.11086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.12811: done with get_vars() 15494 1726853347.12836: done getting variables 15494 1726853347.13002: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:29:07 
-0400 (0:00:01.008) 0:00:15.746 ****** 15494 1726853347.13025: entering _queue_task() for managed_node1/include_tasks 15494 1726853347.13280: worker is 1 (out of 1 available) 15494 1726853347.13293: exiting _queue_task() for managed_node1/include_tasks 15494 1726853347.13306: done queuing things up, now waiting for results queue to drain 15494 1726853347.13307: waiting for pending results... 15494 1726853347.13483: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' 15494 1726853347.13555: in run() - task 02083763-bbaf-0028-1a50-000000000031 15494 1726853347.13568: variable 'ansible_search_path' from source: unknown 15494 1726853347.13598: calling self._execute() 15494 1726853347.13669: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.13674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.13684: variable 'omit' from source: magic vars 15494 1726853347.13954: variable 'ansible_distribution_major_version' from source: facts 15494 1726853347.13965: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853347.13969: variable 'task' from source: play vars 15494 1726853347.14020: variable 'task' from source: play vars 15494 1726853347.14027: _execute() done 15494 1726853347.14030: dumping result to json 15494 1726853347.14032: done dumping result, returning 15494 1726853347.14039: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_present.yml' [02083763-bbaf-0028-1a50-000000000031] 15494 1726853347.14044: sending task result for task 02083763-bbaf-0028-1a50-000000000031 15494 1726853347.14131: done sending task result for task 02083763-bbaf-0028-1a50-000000000031 15494 1726853347.14134: WORKER PROCESS EXITING 15494 1726853347.14165: no more pending results, returning what we have 15494 1726853347.14170: in VariableManager get_vars() 15494 1726853347.14204: Calling all_inventory 
to load vars for managed_node1 15494 1726853347.14207: Calling groups_inventory to load vars for managed_node1 15494 1726853347.14211: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.14225: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.14228: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.14231: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.15074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.16597: done with get_vars() 15494 1726853347.16618: variable 'ansible_search_path' from source: unknown 15494 1726853347.16634: we have included files to process 15494 1726853347.16635: generating all_blocks data 15494 1726853347.16637: done generating all_blocks data 15494 1726853347.16638: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15494 1726853347.16639: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15494 1726853347.16641: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15494 1726853347.16833: in VariableManager get_vars() 15494 1726853347.16850: done with get_vars() 15494 1726853347.17092: done processing included file 15494 1726853347.17094: iterating over new_blocks loaded from include file 15494 1726853347.17096: in VariableManager get_vars() 15494 1726853347.17106: done with get_vars() 15494 1726853347.17108: filtering new block on tags 15494 1726853347.17125: done filtering new block on tags 15494 1726853347.17128: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 15494 1726853347.17132: extending task lists for all hosts with included blocks 15494 1726853347.17161: done extending task lists 15494 1726853347.17162: done processing included files 15494 1726853347.17163: results queue empty 15494 1726853347.17164: checking for any_errors_fatal 15494 1726853347.17165: done checking for any_errors_fatal 15494 1726853347.17166: checking for max_fail_percentage 15494 1726853347.17167: done checking for max_fail_percentage 15494 1726853347.17167: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.17168: done checking to see if all hosts have failed 15494 1726853347.17169: getting the remaining hosts for this loop 15494 1726853347.17170: done getting the remaining hosts for this loop 15494 1726853347.17174: getting the next task for host managed_node1 15494 1726853347.17178: done getting next task for host managed_node1 15494 1726853347.17180: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15494 1726853347.17182: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853347.17184: getting variables 15494 1726853347.17185: in VariableManager get_vars() 15494 1726853347.17193: Calling all_inventory to load vars for managed_node1 15494 1726853347.17196: Calling groups_inventory to load vars for managed_node1 15494 1726853347.17198: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.17203: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.17206: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.17208: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.18401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.19932: done with get_vars() 15494 1726853347.19958: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:29:07 -0400 (0:00:00.070) 0:00:15.816 ****** 15494 1726853347.20036: entering _queue_task() for managed_node1/include_tasks 15494 1726853347.20393: worker is 1 (out of 1 available) 15494 1726853347.20405: exiting _queue_task() for managed_node1/include_tasks 15494 1726853347.20418: done queuing things up, now waiting for results queue to drain 15494 1726853347.20420: waiting for pending results... 
15494 1726853347.20798: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 15494 1726853347.20803: in run() - task 02083763-bbaf-0028-1a50-00000000025f 15494 1726853347.20814: variable 'ansible_search_path' from source: unknown 15494 1726853347.20878: variable 'ansible_search_path' from source: unknown 15494 1726853347.20882: calling self._execute() 15494 1726853347.20956: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.20966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.20982: variable 'omit' from source: magic vars 15494 1726853347.21362: variable 'ansible_distribution_major_version' from source: facts 15494 1726853347.21383: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853347.21395: _execute() done 15494 1726853347.21405: dumping result to json 15494 1726853347.21413: done dumping result, returning 15494 1726853347.21425: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-0028-1a50-00000000025f] 15494 1726853347.21475: sending task result for task 02083763-bbaf-0028-1a50-00000000025f 15494 1726853347.21780: done sending task result for task 02083763-bbaf-0028-1a50-00000000025f 15494 1726853347.21783: WORKER PROCESS EXITING 15494 1726853347.21806: no more pending results, returning what we have 15494 1726853347.21811: in VariableManager get_vars() 15494 1726853347.21841: Calling all_inventory to load vars for managed_node1 15494 1726853347.21844: Calling groups_inventory to load vars for managed_node1 15494 1726853347.21847: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.21858: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.21861: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.21864: Calling groups_plugins_play to load vars for managed_node1 15494 
1726853347.23194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.24838: done with get_vars() 15494 1726853347.24856: variable 'ansible_search_path' from source: unknown 15494 1726853347.24858: variable 'ansible_search_path' from source: unknown 15494 1726853347.24867: variable 'task' from source: play vars 15494 1726853347.24975: variable 'task' from source: play vars 15494 1726853347.25009: we have included files to process 15494 1726853347.25010: generating all_blocks data 15494 1726853347.25012: done generating all_blocks data 15494 1726853347.25013: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15494 1726853347.25014: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15494 1726853347.25017: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15494 1726853347.26035: done processing included file 15494 1726853347.26037: iterating over new_blocks loaded from include file 15494 1726853347.26038: in VariableManager get_vars() 15494 1726853347.26051: done with get_vars() 15494 1726853347.26053: filtering new block on tags 15494 1726853347.26078: done filtering new block on tags 15494 1726853347.26081: in VariableManager get_vars() 15494 1726853347.26094: done with get_vars() 15494 1726853347.26096: filtering new block on tags 15494 1726853347.26118: done filtering new block on tags 15494 1726853347.26120: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 15494 1726853347.26126: extending task lists for all hosts with included blocks 15494 1726853347.26295: done extending 
task lists 15494 1726853347.26297: done processing included files 15494 1726853347.26298: results queue empty 15494 1726853347.26299: checking for any_errors_fatal 15494 1726853347.26303: done checking for any_errors_fatal 15494 1726853347.26303: checking for max_fail_percentage 15494 1726853347.26305: done checking for max_fail_percentage 15494 1726853347.26305: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.26306: done checking to see if all hosts have failed 15494 1726853347.26307: getting the remaining hosts for this loop 15494 1726853347.26308: done getting the remaining hosts for this loop 15494 1726853347.26311: getting the next task for host managed_node1 15494 1726853347.26315: done getting next task for host managed_node1 15494 1726853347.26317: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15494 1726853347.26320: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853347.26323: getting variables 15494 1726853347.26324: in VariableManager get_vars() 15494 1726853347.29943: Calling all_inventory to load vars for managed_node1 15494 1726853347.29946: Calling groups_inventory to load vars for managed_node1 15494 1726853347.29948: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.29953: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.29955: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.29956: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.30601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.31926: done with get_vars() 15494 1726853347.31949: done getting variables 15494 1726853347.31996: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:29:07 -0400 (0:00:00.119) 0:00:15.936 ****** 15494 1726853347.32022: entering _queue_task() for managed_node1/set_fact 15494 1726853347.32367: worker is 1 (out of 1 available) 15494 1726853347.32380: exiting _queue_task() for managed_node1/set_fact 15494 1726853347.32396: done queuing things up, now waiting for results queue to drain 15494 1726853347.32397: waiting for pending results... 
15494 1726853347.32668: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15494 1726853347.32756: in run() - task 02083763-bbaf-0028-1a50-00000000026c 15494 1726853347.32768: variable 'ansible_search_path' from source: unknown 15494 1726853347.32773: variable 'ansible_search_path' from source: unknown 15494 1726853347.32801: calling self._execute() 15494 1726853347.32875: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.32879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.32889: variable 'omit' from source: magic vars 15494 1726853347.33168: variable 'ansible_distribution_major_version' from source: facts 15494 1726853347.33182: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853347.33185: variable 'omit' from source: magic vars 15494 1726853347.33225: variable 'omit' from source: magic vars 15494 1726853347.33251: variable 'omit' from source: magic vars 15494 1726853347.33288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853347.33311: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853347.33327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853347.33340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853347.33352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853347.33375: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853347.33379: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.33382: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15494 1726853347.33452: Set connection var ansible_connection to ssh 15494 1726853347.33455: Set connection var ansible_pipelining to False 15494 1726853347.33458: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853347.33460: Set connection var ansible_shell_type to sh 15494 1726853347.33466: Set connection var ansible_timeout to 10 15494 1726853347.33474: Set connection var ansible_shell_executable to /bin/sh 15494 1726853347.33492: variable 'ansible_shell_executable' from source: unknown 15494 1726853347.33495: variable 'ansible_connection' from source: unknown 15494 1726853347.33506: variable 'ansible_module_compression' from source: unknown 15494 1726853347.33509: variable 'ansible_shell_type' from source: unknown 15494 1726853347.33511: variable 'ansible_shell_executable' from source: unknown 15494 1726853347.33513: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.33515: variable 'ansible_pipelining' from source: unknown 15494 1726853347.33517: variable 'ansible_timeout' from source: unknown 15494 1726853347.33519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.33615: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853347.33622: variable 'omit' from source: magic vars 15494 1726853347.33629: starting attempt loop 15494 1726853347.33633: running the handler 15494 1726853347.33642: handler run complete 15494 1726853347.33650: attempt loop complete, returning result 15494 1726853347.33657: _execute() done 15494 1726853347.33660: dumping result to json 15494 1726853347.33662: done dumping result, returning 15494 1726853347.33669: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-0028-1a50-00000000026c] 15494 1726853347.33676: sending task result for task 02083763-bbaf-0028-1a50-00000000026c 15494 1726853347.33757: done sending task result for task 02083763-bbaf-0028-1a50-00000000026c 15494 1726853347.33759: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15494 1726853347.33831: no more pending results, returning what we have 15494 1726853347.33835: results queue empty 15494 1726853347.33836: checking for any_errors_fatal 15494 1726853347.33837: done checking for any_errors_fatal 15494 1726853347.33838: checking for max_fail_percentage 15494 1726853347.33839: done checking for max_fail_percentage 15494 1726853347.33840: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.33841: done checking to see if all hosts have failed 15494 1726853347.33842: getting the remaining hosts for this loop 15494 1726853347.33843: done getting the remaining hosts for this loop 15494 1726853347.33849: getting the next task for host managed_node1 15494 1726853347.33856: done getting next task for host managed_node1 15494 1726853347.33859: ^ task is: TASK: Stat profile file 15494 1726853347.33862: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853347.33867: getting variables 15494 1726853347.33876: in VariableManager get_vars() 15494 1726853347.33904: Calling all_inventory to load vars for managed_node1 15494 1726853347.33907: Calling groups_inventory to load vars for managed_node1 15494 1726853347.33910: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.33918: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.33920: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.33923: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.35022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.36128: done with get_vars() 15494 1726853347.36143: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:29:07 -0400 (0:00:00.041) 0:00:15.977 ****** 15494 1726853347.36214: entering _queue_task() for managed_node1/stat 15494 1726853347.36454: worker is 1 (out of 1 available) 15494 1726853347.36467: exiting _queue_task() for managed_node1/stat 15494 1726853347.36481: done queuing things up, now waiting for results queue to drain 15494 1726853347.36482: waiting for pending results... 
15494 1726853347.36649: running TaskExecutor() for managed_node1/TASK: Stat profile file 15494 1726853347.36723: in run() - task 02083763-bbaf-0028-1a50-00000000026d 15494 1726853347.36734: variable 'ansible_search_path' from source: unknown 15494 1726853347.36737: variable 'ansible_search_path' from source: unknown 15494 1726853347.36768: calling self._execute() 15494 1726853347.36845: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.36853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.36862: variable 'omit' from source: magic vars 15494 1726853347.37141: variable 'ansible_distribution_major_version' from source: facts 15494 1726853347.37153: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853347.37158: variable 'omit' from source: magic vars 15494 1726853347.37189: variable 'omit' from source: magic vars 15494 1726853347.37256: variable 'profile' from source: play vars 15494 1726853347.37267: variable 'interface' from source: set_fact 15494 1726853347.37314: variable 'interface' from source: set_fact 15494 1726853347.37328: variable 'omit' from source: magic vars 15494 1726853347.37360: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853347.37391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853347.37406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853347.37419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853347.37430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853347.37455: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 
1726853347.37458: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.37461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.37530: Set connection var ansible_connection to ssh 15494 1726853347.37533: Set connection var ansible_pipelining to False 15494 1726853347.37539: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853347.37542: Set connection var ansible_shell_type to sh 15494 1726853347.37550: Set connection var ansible_timeout to 10 15494 1726853347.37554: Set connection var ansible_shell_executable to /bin/sh 15494 1726853347.37572: variable 'ansible_shell_executable' from source: unknown 15494 1726853347.37575: variable 'ansible_connection' from source: unknown 15494 1726853347.37578: variable 'ansible_module_compression' from source: unknown 15494 1726853347.37587: variable 'ansible_shell_type' from source: unknown 15494 1726853347.37590: variable 'ansible_shell_executable' from source: unknown 15494 1726853347.37592: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.37595: variable 'ansible_pipelining' from source: unknown 15494 1726853347.37597: variable 'ansible_timeout' from source: unknown 15494 1726853347.37599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.37732: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853347.37743: variable 'omit' from source: magic vars 15494 1726853347.37748: starting attempt loop 15494 1726853347.37751: running the handler 15494 1726853347.37856: _low_level_execute_command(): starting 15494 1726853347.37859: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853347.38555: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.38561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.38593: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.40257: stdout chunk (state=3): >>>/root <<< 15494 1726853347.40386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.40421: stderr chunk (state=3): >>><<< 15494 1726853347.40424: stdout chunk (state=3): >>><<< 15494 1726853347.40457: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853347.40493: _low_level_execute_command(): starting 15494 1726853347.40497: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645 `" && echo ansible-tmp-1726853347.40456-16252-222623472995645="` echo /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645 `" ) && sleep 0' 15494 1726853347.41037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853347.41040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853347.41043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853347.41055: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853347.41058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.41102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853347.41149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.41218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.43073: stdout chunk (state=3): >>>ansible-tmp-1726853347.40456-16252-222623472995645=/root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645 <<< 15494 1726853347.43167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.43199: stderr chunk (state=3): >>><<< 15494 1726853347.43202: stdout chunk (state=3): >>><<< 15494 1726853347.43217: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853347.40456-16252-222623472995645=/root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853347.43255: variable 'ansible_module_compression' from source: unknown 15494 1726853347.43305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15494 1726853347.43338: variable 'ansible_facts' from source: unknown 15494 1726853347.43404: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py 15494 1726853347.43503: Sending initial data 15494 1726853347.43507: Sent initial data (151 bytes) 15494 1726853347.43954: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853347.43957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853347.43959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.43962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15494 1726853347.43964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.44014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853347.44021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853347.44023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.44058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.45583: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15494 1726853347.45587: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853347.45648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853347.45675: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpaf7fm_d6 /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py <<< 15494 1726853347.45703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py" <<< 15494 1726853347.45750: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpaf7fm_d6" to remote "/root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py" <<< 15494 1726853347.46470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.46516: stderr chunk (state=3): >>><<< 15494 1726853347.46519: stdout chunk (state=3): >>><<< 15494 1726853347.46584: done transferring module to remote 15494 1726853347.46588: _low_level_execute_command(): starting 15494 1726853347.46618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/ /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py && sleep 0' 15494 1726853347.47211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853347.47216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853347.47218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.47220: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853347.47225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.47287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853347.47291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.47350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.49067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.49144: stderr chunk (state=3): >>><<< 15494 1726853347.49151: stdout chunk (state=3): >>><<< 15494 1726853347.49257: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853347.49264: _low_level_execute_command(): starting 15494 1726853347.49266: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/AnsiballZ_stat.py && sleep 0' 15494 1726853347.49836: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853347.49844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853347.49885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.49967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853347.49986: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15494 1726853347.50016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.65359: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15494 1726853347.66463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.66468: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 15494 1726853347.66531: stderr chunk (state=3): >>><<< 15494 1726853347.66534: stdout chunk (state=3): >>><<< 15494 1726853347.66659: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853347.66663: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853347.66666: _low_level_execute_command(): starting 15494 1726853347.66667: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853347.40456-16252-222623472995645/ > /dev/null 2>&1 && sleep 0' 15494 1726853347.67389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853347.67394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853347.67576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853347.67580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853347.67582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 <<< 15494 1726853347.67584: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.67596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853347.67604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853347.67608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.67821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.69678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.69682: stdout chunk (state=3): >>><<< 15494 1726853347.69777: stderr chunk (state=3): >>><<< 15494 1726853347.69781: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853347.69783: handler run complete 15494 1726853347.69786: attempt loop complete, returning result 15494 1726853347.69788: _execute() done 15494 1726853347.69790: dumping result to json 15494 1726853347.69792: done dumping result, returning 15494 1726853347.69794: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-0028-1a50-00000000026d] 15494 1726853347.69796: sending task result for task 02083763-bbaf-0028-1a50-00000000026d 15494 1726853347.69857: done sending task result for task 02083763-bbaf-0028-1a50-00000000026d 15494 1726853347.69860: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15494 1726853347.69935: no more pending results, returning what we have 15494 1726853347.69939: results queue empty 15494 1726853347.69940: checking for any_errors_fatal 15494 1726853347.69947: done checking for any_errors_fatal 15494 1726853347.69948: checking for max_fail_percentage 15494 1726853347.69949: done checking for max_fail_percentage 15494 1726853347.69950: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.69951: done checking to see if all hosts have failed 15494 1726853347.69952: getting the remaining hosts for this loop 15494 1726853347.69954: done getting the remaining hosts for this loop 15494 1726853347.69958: getting the next task for host 
managed_node1 15494 1726853347.69966: done getting next task for host managed_node1 15494 1726853347.69969: ^ task is: TASK: Set NM profile exist flag based on the profile files 15494 1726853347.69974: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853347.69978: getting variables 15494 1726853347.69983: in VariableManager get_vars() 15494 1726853347.70013: Calling all_inventory to load vars for managed_node1 15494 1726853347.70016: Calling groups_inventory to load vars for managed_node1 15494 1726853347.70019: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.70031: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.70034: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.70037: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.72860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.76237: done with get_vars() 15494 1726853347.76380: done getting variables 15494 1726853347.76441: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:29:07 -0400 (0:00:00.403) 0:00:16.381 ****** 15494 1726853347.76587: entering _queue_task() for managed_node1/set_fact 15494 1726853347.77348: worker is 1 (out of 1 available) 15494 1726853347.77361: exiting _queue_task() for managed_node1/set_fact 15494 1726853347.77374: done queuing things up, now waiting for results queue to drain 15494 1726853347.77376: waiting for pending results... 
15494 1726853347.77888: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15494 1726853347.77893: in run() - task 02083763-bbaf-0028-1a50-00000000026e 15494 1726853347.77896: variable 'ansible_search_path' from source: unknown 15494 1726853347.77898: variable 'ansible_search_path' from source: unknown 15494 1726853347.78277: calling self._execute() 15494 1726853347.78281: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.78284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.78286: variable 'omit' from source: magic vars 15494 1726853347.78969: variable 'ansible_distribution_major_version' from source: facts 15494 1726853347.79276: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853347.79319: variable 'profile_stat' from source: set_fact 15494 1726853347.79338: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853347.79345: when evaluation is False, skipping this task 15494 1726853347.79676: _execute() done 15494 1726853347.79679: dumping result to json 15494 1726853347.79682: done dumping result, returning 15494 1726853347.79684: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-0028-1a50-00000000026e] 15494 1726853347.79687: sending task result for task 02083763-bbaf-0028-1a50-00000000026e 15494 1726853347.79758: done sending task result for task 02083763-bbaf-0028-1a50-00000000026e 15494 1726853347.79762: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853347.79815: no more pending results, returning what we have 15494 1726853347.79819: results queue empty 15494 1726853347.79820: checking for any_errors_fatal 15494 1726853347.79831: done checking for any_errors_fatal 15494 1726853347.79831: 
checking for max_fail_percentage 15494 1726853347.79833: done checking for max_fail_percentage 15494 1726853347.79834: checking to see if all hosts have failed and the running result is not ok 15494 1726853347.79834: done checking to see if all hosts have failed 15494 1726853347.79835: getting the remaining hosts for this loop 15494 1726853347.79836: done getting the remaining hosts for this loop 15494 1726853347.79842: getting the next task for host managed_node1 15494 1726853347.79849: done getting next task for host managed_node1 15494 1726853347.79853: ^ task is: TASK: Get NM profile info 15494 1726853347.79856: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853347.79859: getting variables 15494 1726853347.79861: in VariableManager get_vars() 15494 1726853347.79893: Calling all_inventory to load vars for managed_node1 15494 1726853347.79896: Calling groups_inventory to load vars for managed_node1 15494 1726853347.79899: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853347.79910: Calling all_plugins_play to load vars for managed_node1 15494 1726853347.79912: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853347.79915: Calling groups_plugins_play to load vars for managed_node1 15494 1726853347.83063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853347.86427: done with get_vars() 15494 1726853347.86453: done getting variables 15494 1726853347.86678: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:29:07 -0400 (0:00:00.102) 0:00:16.483 ****** 15494 1726853347.86792: entering _queue_task() for managed_node1/shell 15494 1726853347.86794: Creating lock for shell 15494 1726853347.87584: worker is 1 (out of 1 available) 15494 1726853347.87594: exiting _queue_task() for managed_node1/shell 15494 1726853347.87605: done queuing things up, now waiting for results queue to drain 15494 1726853347.87606: waiting for pending results... 
15494 1726853347.88186: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15494 1726853347.88190: in run() - task 02083763-bbaf-0028-1a50-00000000026f 15494 1726853347.88194: variable 'ansible_search_path' from source: unknown 15494 1726853347.88196: variable 'ansible_search_path' from source: unknown 15494 1726853347.88577: calling self._execute() 15494 1726853347.88581: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.88584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.88587: variable 'omit' from source: magic vars 15494 1726853347.89253: variable 'ansible_distribution_major_version' from source: facts 15494 1726853347.89270: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853347.89283: variable 'omit' from source: magic vars 15494 1726853347.89337: variable 'omit' from source: magic vars 15494 1726853347.89675: variable 'profile' from source: play vars 15494 1726853347.89684: variable 'interface' from source: set_fact 15494 1726853347.89751: variable 'interface' from source: set_fact 15494 1726853347.89779: variable 'omit' from source: magic vars 15494 1726853347.89826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853347.90110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853347.90275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853347.90279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853347.90281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853347.90283: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 
1726853347.90286: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.90287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.90323: Set connection var ansible_connection to ssh 15494 1726853347.90776: Set connection var ansible_pipelining to False 15494 1726853347.90779: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853347.90781: Set connection var ansible_shell_type to sh 15494 1726853347.90783: Set connection var ansible_timeout to 10 15494 1726853347.90785: Set connection var ansible_shell_executable to /bin/sh 15494 1726853347.90787: variable 'ansible_shell_executable' from source: unknown 15494 1726853347.90789: variable 'ansible_connection' from source: unknown 15494 1726853347.90791: variable 'ansible_module_compression' from source: unknown 15494 1726853347.90793: variable 'ansible_shell_type' from source: unknown 15494 1726853347.90795: variable 'ansible_shell_executable' from source: unknown 15494 1726853347.90796: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853347.90798: variable 'ansible_pipelining' from source: unknown 15494 1726853347.90800: variable 'ansible_timeout' from source: unknown 15494 1726853347.90802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853347.90825: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853347.90842: variable 'omit' from source: magic vars 15494 1726853347.90984: starting attempt loop 15494 1726853347.90992: running the handler 15494 1726853347.91006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853347.91031: _low_level_execute_command(): starting 15494 1726853347.91044: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853347.92422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853347.92436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853347.92486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853347.92500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853347.92513: stderr chunk (state=3): >>>debug2: match found <<< 15494 1726853347.92583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.92603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853347.92630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853347.92652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.92732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15494 1726853347.94421: stdout chunk (state=3): >>>/root <<< 15494 1726853347.94524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.94727: stderr chunk (state=3): >>><<< 15494 1726853347.94731: stdout chunk (state=3): >>><<< 15494 1726853347.94759: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853347.94775: _low_level_execute_command(): starting 15494 1726853347.94782: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663 `" && echo ansible-tmp-1726853347.947596-16269-18689763817663="` echo /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663 `" ) && sleep 0' 15494 
1726853347.95964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853347.95968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853347.95973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.95975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853347.95978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853347.96084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853347.96097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853347.96156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853347.98048: stdout chunk (state=3): >>>ansible-tmp-1726853347.947596-16269-18689763817663=/root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663 <<< 15494 1726853347.98151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853347.98185: stderr chunk (state=3): >>><<< 15494 1726853347.98195: stdout chunk (state=3): >>><<< 15494 1726853347.98218: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1726853347.947596-16269-18689763817663=/root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853347.98258: variable 'ansible_module_compression' from source: unknown 15494 1726853347.98426: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15494 1726853347.98476: variable 'ansible_facts' from source: unknown 15494 1726853347.98838: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py 15494 1726853347.98987: Sending initial data 15494 1726853347.98998: Sent initial data (154 bytes) 15494 1726853348.00126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 15494 1726853348.00129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853348.00132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.00135: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.00137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853348.00139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.00310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853348.00322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.00378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.01922: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853348.01981: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853348.02013: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpyoapjqy_ /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py <<< 15494 1726853348.02020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py" <<< 15494 1726853348.02229: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpyoapjqy_" to remote "/root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py" <<< 15494 1726853348.03555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.03612: stderr chunk (state=3): >>><<< 15494 1726853348.03622: stdout chunk (state=3): >>><<< 15494 1726853348.03656: done transferring module to remote 15494 1726853348.03866: _low_level_execute_command(): starting 15494 1726853348.03869: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/ /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py && sleep 0' 15494 1726853348.04937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.04954: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853348.05062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853348.05078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853348.05177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.05357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.07120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.07123: stdout chunk (state=3): >>><<< 15494 1726853348.07125: stderr chunk (state=3): >>><<< 15494 1726853348.07140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853348.07148: _low_level_execute_command(): starting 15494 1726853348.07256: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/AnsiballZ_command.py && sleep 0' 15494 1726853348.07951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853348.07963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.08086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853348.08100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853348.08118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.08190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.25085: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:29:08.232928", "end": "2024-09-20 13:29:08.249977", "delta": "0:00:00.017049", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15494 1726853348.26650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853348.26676: stderr chunk (state=3): >>><<< 15494 1726853348.26682: stdout chunk (state=3): >>><<< 15494 1726853348.26700: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:29:08.232928", "end": "2024-09-20 13:29:08.249977", "delta": "0:00:00.017049", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853348.26726: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853348.26734: _low_level_execute_command(): starting 15494 1726853348.26738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853347.947596-16269-18689763817663/ > /dev/null 2>&1 && sleep 0' 15494 1726853348.27369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.27407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.27410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.27577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.27580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.29402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.29416: stderr chunk (state=3): >>><<< 15494 1726853348.29419: stdout chunk (state=3): >>><<< 15494 1726853348.29432: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15494 1726853348.29438: handler run complete 15494 1726853348.29463: Evaluated conditional (False): False 15494 1726853348.29466: attempt loop complete, returning result 15494 1726853348.29469: _execute() done 15494 1726853348.29473: dumping result to json 15494 1726853348.29498: done dumping result, returning 15494 1726853348.29502: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-0028-1a50-00000000026f] 15494 1726853348.29504: sending task result for task 02083763-bbaf-0028-1a50-00000000026f 15494 1726853348.29597: done sending task result for task 02083763-bbaf-0028-1a50-00000000026f 15494 1726853348.29600: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017049", "end": "2024-09-20 13:29:08.249977", "rc": 0, "start": "2024-09-20 13:29:08.232928" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15494 1726853348.29706: no more pending results, returning what we have 15494 1726853348.29710: results queue empty 15494 1726853348.29711: checking for any_errors_fatal 15494 1726853348.29719: done checking for any_errors_fatal 15494 1726853348.29720: checking for max_fail_percentage 15494 1726853348.29722: done checking for max_fail_percentage 15494 1726853348.29724: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.29725: done checking to see if all hosts have failed 15494 1726853348.29736: getting the remaining hosts for this loop 15494 1726853348.29737: done getting the remaining hosts for this loop 15494 1726853348.29741: getting the next task for host managed_node1 15494 1726853348.29750: done getting next task for host managed_node1 15494 1726853348.29753: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15494 
1726853348.29756: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853348.29760: getting variables 15494 1726853348.29762: in VariableManager get_vars() 15494 1726853348.29793: Calling all_inventory to load vars for managed_node1 15494 1726853348.29795: Calling groups_inventory to load vars for managed_node1 15494 1726853348.29798: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.29808: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.29810: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.29813: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.30891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.32421: done with get_vars() 15494 1726853348.32441: done getting variables 15494 1726853348.32491: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:29:08 -0400 (0:00:00.457) 0:00:16.940 ****** 15494 1726853348.32520: entering _queue_task() for managed_node1/set_fact 15494 1726853348.32843: worker is 1 (out of 1 available) 15494 1726853348.32856: exiting _queue_task() for managed_node1/set_fact 15494 1726853348.32870: done queuing things up, now waiting for results queue to drain 15494 1726853348.32876: waiting for pending results... 15494 1726853348.33114: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15494 1726853348.33188: in run() - task 02083763-bbaf-0028-1a50-000000000270 15494 1726853348.33199: variable 'ansible_search_path' from source: unknown 15494 1726853348.33203: variable 'ansible_search_path' from source: unknown 15494 1726853348.33234: calling self._execute() 15494 1726853348.33318: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.33322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.33330: variable 'omit' from source: magic vars 15494 1726853348.33684: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.33695: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.33896: variable 'nm_profile_exists' from source: set_fact 15494 1726853348.33935: Evaluated conditional (nm_profile_exists.rc == 0): True 15494 1726853348.33938: variable 'omit' from source: magic vars 15494 1726853348.34022: variable 'omit' from source: magic vars 15494 1726853348.34046: variable 'omit' from source: magic vars 15494 1726853348.34113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 
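The "Get NM profile info" task above shows the full lifecycle Ansible drives for a command module: create a private remote temp dir, upload `AnsiballZ_command.py` over the multiplexed SSH/SFTP channel, `chmod` it, run it with the remote Python, and remove the temp dir. A minimal local sketch of that lifecycle, using `mktemp` and a stand-in payload instead of the real timestamped path and AnsiballZ wrapper:

```shell
# Sketch of the temp-dir lifecycle seen in the log above.
# The path and payload are illustrative stand-ins: the real dir name embeds a
# timestamp and PIDs, and the real payload is the zipped AnsiballZ module.
TMP="$(mktemp -d /tmp/ansible-tmp-XXXXXX)"                 # umask-protected tmp dir
printf 'print({"changed": True, "rc": 0})\n' > "$TMP/AnsiballZ_command.py"  # stand-in payload
chmod u+x "$TMP" "$TMP/AnsiballZ_command.py"               # as in the chmod step
python3 "$TMP/AnsiballZ_command.py"                        # module prints its JSON result
rm -rf "$TMP"                                              # cleanup, as in the final rm -f -r
```

The JSON printed on stdout is what the controller parses back into the task result shown later in the log.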
15494 1726853348.34150: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853348.34213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853348.34217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.34219: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.34482: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853348.34486: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.34490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.34492: Set connection var ansible_connection to ssh 15494 1726853348.34494: Set connection var ansible_pipelining to False 15494 1726853348.34496: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853348.34498: Set connection var ansible_shell_type to sh 15494 1726853348.34500: Set connection var ansible_timeout to 10 15494 1726853348.34502: Set connection var ansible_shell_executable to /bin/sh 15494 1726853348.34504: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.34506: variable 'ansible_connection' from source: unknown 15494 1726853348.34508: variable 'ansible_module_compression' from source: unknown 15494 1726853348.34510: variable 'ansible_shell_type' from source: unknown 15494 1726853348.34512: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.34514: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.34515: variable 'ansible_pipelining' from source: unknown 15494 1726853348.34517: variable 'ansible_timeout' from source: unknown 15494 1726853348.34520: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.34749: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853348.34787: variable 'omit' from source: magic vars 15494 1726853348.34795: starting attempt loop 15494 1726853348.34798: running the handler 15494 1726853348.34838: handler run complete 15494 1726853348.34856: attempt loop complete, returning result 15494 1726853348.34859: _execute() done 15494 1726853348.34862: dumping result to json 15494 1726853348.34864: done dumping result, returning 15494 1726853348.34867: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-0028-1a50-000000000270] 15494 1726853348.34873: sending task result for task 02083763-bbaf-0028-1a50-000000000270 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15494 1726853348.35082: no more pending results, returning what we have 15494 1726853348.35086: results queue empty 15494 1726853348.35087: checking for any_errors_fatal 15494 1726853348.35096: done checking for any_errors_fatal 15494 1726853348.35096: checking for max_fail_percentage 15494 1726853348.35098: done checking for max_fail_percentage 15494 1726853348.35099: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.35100: done checking to see if all hosts have failed 15494 1726853348.35100: getting the remaining hosts for this loop 15494 1726853348.35102: done getting the remaining hosts for this loop 15494 1726853348.35105: getting the next task for host managed_node1 15494 
1726853348.35115: done getting next task for host managed_node1 15494 1726853348.35117: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15494 1726853348.35120: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.35124: getting variables 15494 1726853348.35125: in VariableManager get_vars() 15494 1726853348.35153: Calling all_inventory to load vars for managed_node1 15494 1726853348.35156: Calling groups_inventory to load vars for managed_node1 15494 1726853348.35159: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.35170: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.35185: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.35193: done sending task result for task 02083763-bbaf-0028-1a50-000000000270 15494 1726853348.35195: WORKER PROCESS EXITING 15494 1726853348.35199: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.36891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.38362: done with get_vars() 15494 1726853348.38381: done getting variables 15494 1726853348.38425: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853348.38519: variable 'profile' from source: play vars 15494 1726853348.38522: variable 'interface' from source: set_fact 15494 1726853348.38566: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:29:08 -0400 (0:00:00.060) 0:00:17.001 ****** 15494 1726853348.38596: entering _queue_task() for managed_node1/command 15494 1726853348.38869: worker is 1 (out of 1 available) 15494 1726853348.38885: exiting _queue_task() for managed_node1/command 15494 
1726853348.38902: done queuing things up, now waiting for results queue to drain 15494 1726853348.38904: waiting for pending results... 15494 1726853348.39072: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15494 1726853348.39377: in run() - task 02083763-bbaf-0028-1a50-000000000272 15494 1726853348.39381: variable 'ansible_search_path' from source: unknown 15494 1726853348.39384: variable 'ansible_search_path' from source: unknown 15494 1726853348.39390: calling self._execute() 15494 1726853348.39394: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.39397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.39399: variable 'omit' from source: magic vars 15494 1726853348.39981: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.39992: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.40187: variable 'profile_stat' from source: set_fact 15494 1726853348.40274: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853348.40279: when evaluation is False, skipping this task 15494 1726853348.40282: _execute() done 15494 1726853348.40286: dumping result to json 15494 1726853348.40288: done dumping result, returning 15494 1726853348.40291: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000272] 15494 1726853348.40293: sending task result for task 02083763-bbaf-0028-1a50-000000000272 15494 1726853348.40359: done sending task result for task 02083763-bbaf-0028-1a50-000000000272 15494 1726853348.40362: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853348.40420: no more pending results, returning what we have 15494 1726853348.40429: results queue 
empty 15494 1726853348.40430: checking for any_errors_fatal 15494 1726853348.40441: done checking for any_errors_fatal 15494 1726853348.40442: checking for max_fail_percentage 15494 1726853348.40443: done checking for max_fail_percentage 15494 1726853348.40444: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.40445: done checking to see if all hosts have failed 15494 1726853348.40446: getting the remaining hosts for this loop 15494 1726853348.40447: done getting the remaining hosts for this loop 15494 1726853348.40451: getting the next task for host managed_node1 15494 1726853348.40459: done getting next task for host managed_node1 15494 1726853348.40461: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15494 1726853348.40464: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.40467: getting variables 15494 1726853348.40469: in VariableManager get_vars() 15494 1726853348.40604: Calling all_inventory to load vars for managed_node1 15494 1726853348.40606: Calling groups_inventory to load vars for managed_node1 15494 1726853348.40610: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.40622: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.40627: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.40635: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.41805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.43008: done with get_vars() 15494 1726853348.43025: done getting variables 15494 1726853348.43083: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853348.43172: variable 'profile' from source: play vars 15494 1726853348.43175: variable 'interface' from source: set_fact 15494 1726853348.43237: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:29:08 -0400 (0:00:00.046) 0:00:17.048 ****** 15494 1726853348.43268: entering _queue_task() for managed_node1/set_fact 15494 1726853348.43619: worker is 1 (out of 1 available) 15494 1726853348.43632: exiting _queue_task() for managed_node1/set_fact 15494 1726853348.43645: done queuing things up, now waiting for results queue to drain 15494 1726853348.43647: waiting for pending results... 
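
The command task queued above (get_profile_stat.yml:49) is about to be skipped because `profile_stat.stat.exists` evaluates False. A plausible reconstruction of that task follows; the actual command body is not visible in the log, so the `grep` invocation, the ifcfg path, and the `register` name are assumptions:

```yaml
# Hypothetical sketch of the task at get_profile_stat.yml:49.
# Only the task name and the two when-conditions are confirmed by the log;
# the command line and register name are assumptions.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment
  when:
    - ansible_distribution_major_version != '6'  # log: evaluated True
    - profile_stat.stat.exists                   # log: evaluated False -> task skipped
```

When the second condition is False, Ansible emits exactly the `skipping: [managed_node1]` result with `"false_condition": "profile_stat.stat.exists"` seen in the log.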
15494 1726853348.44090: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15494 1726853348.44098: in run() - task 02083763-bbaf-0028-1a50-000000000273 15494 1726853348.44102: variable 'ansible_search_path' from source: unknown 15494 1726853348.44105: variable 'ansible_search_path' from source: unknown 15494 1726853348.44144: calling self._execute() 15494 1726853348.44256: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.44260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.44272: variable 'omit' from source: magic vars 15494 1726853348.44613: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.44616: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.44804: variable 'profile_stat' from source: set_fact 15494 1726853348.44808: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853348.44810: when evaluation is False, skipping this task 15494 1726853348.44813: _execute() done 15494 1726853348.44815: dumping result to json 15494 1726853348.44817: done dumping result, returning 15494 1726853348.44830: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000273] 15494 1726853348.44835: sending task result for task 02083763-bbaf-0028-1a50-000000000273 15494 1726853348.44908: done sending task result for task 02083763-bbaf-0028-1a50-000000000273 15494 1726853348.44910: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853348.45017: no more pending results, returning what we have 15494 1726853348.45020: results queue empty 15494 1726853348.45021: checking for any_errors_fatal 15494 1726853348.45027: done checking for any_errors_fatal 15494 
1726853348.45027: checking for max_fail_percentage 15494 1726853348.45029: done checking for max_fail_percentage 15494 1726853348.45029: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.45030: done checking to see if all hosts have failed 15494 1726853348.45031: getting the remaining hosts for this loop 15494 1726853348.45032: done getting the remaining hosts for this loop 15494 1726853348.45035: getting the next task for host managed_node1 15494 1726853348.45041: done getting next task for host managed_node1 15494 1726853348.45043: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15494 1726853348.45050: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.45053: getting variables 15494 1726853348.45054: in VariableManager get_vars() 15494 1726853348.45083: Calling all_inventory to load vars for managed_node1 15494 1726853348.45088: Calling groups_inventory to load vars for managed_node1 15494 1726853348.45091: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.45106: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.45109: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.45117: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.46569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.48073: done with get_vars() 15494 1726853348.48101: done getting variables 15494 1726853348.48169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853348.48285: variable 'profile' from source: play vars 15494 1726853348.48289: variable 'interface' from source: set_fact 15494 1726853348.48332: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:29:08 -0400 (0:00:00.050) 0:00:17.099 ****** 15494 1726853348.48356: entering _queue_task() for managed_node1/command 15494 1726853348.48691: worker is 1 (out of 1 available) 15494 1726853348.48705: exiting _queue_task() for managed_node1/command 15494 1726853348.48719: done queuing things up, now waiting for results queue to drain 15494 1726853348.48720: waiting for pending results... 
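
The `set_fact` task just skipped (get_profile_stat.yml:56) guards a verification flag behind the same conditional. A minimal sketch, assuming the fact name from the earlier `set_fact` result (`lsr_net_profile_ansible_managed`) and an assumed comparison against the registered command output:

```yaml
# Hypothetical sketch of the task at get_profile_stat.yml:56.
# The fact name matches the facts shown earlier in this log; the value
# expression is an assumption.
- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ansible_managed_comment.stdout | length > 0 }}"
  when: profile_stat.stat.exists  # log: evaluated False, so the earlier value is kept
```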
15494 1726853348.48931: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15494 1726853348.49029: in run() - task 02083763-bbaf-0028-1a50-000000000274 15494 1726853348.49054: variable 'ansible_search_path' from source: unknown 15494 1726853348.49058: variable 'ansible_search_path' from source: unknown 15494 1726853348.49085: calling self._execute() 15494 1726853348.49163: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.49166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.49186: variable 'omit' from source: magic vars 15494 1726853348.49458: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.49468: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.49551: variable 'profile_stat' from source: set_fact 15494 1726853348.49560: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853348.49563: when evaluation is False, skipping this task 15494 1726853348.49565: _execute() done 15494 1726853348.49568: dumping result to json 15494 1726853348.49572: done dumping result, returning 15494 1726853348.49579: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000274] 15494 1726853348.49583: sending task result for task 02083763-bbaf-0028-1a50-000000000274 15494 1726853348.49663: done sending task result for task 02083763-bbaf-0028-1a50-000000000274 15494 1726853348.49665: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853348.49715: no more pending results, returning what we have 15494 1726853348.49718: results queue empty 15494 1726853348.49719: checking for any_errors_fatal 15494 1726853348.49726: done checking for any_errors_fatal 15494 1726853348.49727: 
checking for max_fail_percentage 15494 1726853348.49729: done checking for max_fail_percentage 15494 1726853348.49730: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.49730: done checking to see if all hosts have failed 15494 1726853348.49731: getting the remaining hosts for this loop 15494 1726853348.49733: done getting the remaining hosts for this loop 15494 1726853348.49736: getting the next task for host managed_node1 15494 1726853348.49743: done getting next task for host managed_node1 15494 1726853348.49745: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15494 1726853348.49758: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.49761: getting variables 15494 1726853348.49762: in VariableManager get_vars() 15494 1726853348.49798: Calling all_inventory to load vars for managed_node1 15494 1726853348.49800: Calling groups_inventory to load vars for managed_node1 15494 1726853348.49804: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.49819: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.49821: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.49825: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.50768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.52138: done with get_vars() 15494 1726853348.52165: done getting variables 15494 1726853348.52236: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853348.52339: variable 'profile' from source: play vars 15494 1726853348.52343: variable 'interface' from source: set_fact 15494 1726853348.52389: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:29:08 -0400 (0:00:00.040) 0:00:17.140 ****** 15494 1726853348.52424: entering _queue_task() for managed_node1/set_fact 15494 1726853348.52746: worker is 1 (out of 1 available) 15494 1726853348.52759: exiting _queue_task() for managed_node1/set_fact 15494 1726853348.52774: done queuing things up, now waiting for results queue to drain 15494 1726853348.52776: waiting for pending results... 
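
The fingerprint verification task queued above (get_profile_stat.yml:69) follows the same skip pattern, and the assert task that runs afterwards checks the flags set earlier. A sketch of both, assuming the fact names visible in this log (`lsr_net_profile_fingerprint`, `lsr_net_profile_exists`); the exact assertion list is an assumption:

```yaml
# Hypothetical sketches; task names, paths, and fact names come from the log,
# the bodies are assumptions.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists  # log: evaluated False -> skipped

# assert_profile_present.yml:5 -- the log shows this conditional evaluating
# True and the task reporting "All assertions passed".
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
```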
15494 1726853348.53023: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15494 1726853348.53166: in run() - task 02083763-bbaf-0028-1a50-000000000275 15494 1726853348.53173: variable 'ansible_search_path' from source: unknown 15494 1726853348.53176: variable 'ansible_search_path' from source: unknown 15494 1726853348.53234: calling self._execute() 15494 1726853348.53337: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.53340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.53343: variable 'omit' from source: magic vars 15494 1726853348.53664: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.53699: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.53787: variable 'profile_stat' from source: set_fact 15494 1726853348.53794: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853348.53797: when evaluation is False, skipping this task 15494 1726853348.53831: _execute() done 15494 1726853348.53834: dumping result to json 15494 1726853348.53836: done dumping result, returning 15494 1726853348.53838: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000275] 15494 1726853348.53840: sending task result for task 02083763-bbaf-0028-1a50-000000000275 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853348.53988: no more pending results, returning what we have 15494 1726853348.53992: results queue empty 15494 1726853348.53993: checking for any_errors_fatal 15494 1726853348.53998: done checking for any_errors_fatal 15494 1726853348.53999: checking for max_fail_percentage 15494 1726853348.54000: done checking for max_fail_percentage 15494 1726853348.54001: checking to see if all 
hosts have failed and the running result is not ok 15494 1726853348.54002: done checking to see if all hosts have failed 15494 1726853348.54003: getting the remaining hosts for this loop 15494 1726853348.54004: done getting the remaining hosts for this loop 15494 1726853348.54012: getting the next task for host managed_node1 15494 1726853348.54021: done getting next task for host managed_node1 15494 1726853348.54023: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15494 1726853348.54026: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.54031: getting variables 15494 1726853348.54032: in VariableManager get_vars() 15494 1726853348.54060: Calling all_inventory to load vars for managed_node1 15494 1726853348.54062: Calling groups_inventory to load vars for managed_node1 15494 1726853348.54066: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.54080: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.54083: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.54085: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.54682: done sending task result for task 02083763-bbaf-0028-1a50-000000000275 15494 1726853348.54686: WORKER PROCESS EXITING 15494 1726853348.54923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.55984: done with get_vars() 15494 1726853348.55998: done getting variables 15494 1726853348.56041: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853348.56121: variable 'profile' from source: play vars 15494 1726853348.56124: variable 'interface' from source: set_fact 15494 1726853348.56164: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:29:08 -0400 (0:00:00.037) 0:00:17.177 ****** 15494 1726853348.56188: entering _queue_task() for managed_node1/assert 15494 1726853348.56428: worker is 1 (out of 1 available) 15494 1726853348.56441: exiting _queue_task() for managed_node1/assert 15494 
1726853348.56453: done queuing things up, now waiting for results queue to drain 15494 1726853348.56454: waiting for pending results... 15494 1726853348.56622: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' 15494 1726853348.56695: in run() - task 02083763-bbaf-0028-1a50-000000000260 15494 1726853348.56705: variable 'ansible_search_path' from source: unknown 15494 1726853348.56708: variable 'ansible_search_path' from source: unknown 15494 1726853348.56739: calling self._execute() 15494 1726853348.56824: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.56827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.56845: variable 'omit' from source: magic vars 15494 1726853348.57221: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.57230: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.57252: variable 'omit' from source: magic vars 15494 1726853348.57308: variable 'omit' from source: magic vars 15494 1726853348.57381: variable 'profile' from source: play vars 15494 1726853348.57385: variable 'interface' from source: set_fact 15494 1726853348.57429: variable 'interface' from source: set_fact 15494 1726853348.57442: variable 'omit' from source: magic vars 15494 1726853348.57481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853348.57507: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853348.57533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853348.57546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.57560: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.57586: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853348.57589: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.57591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.57659: Set connection var ansible_connection to ssh 15494 1726853348.57662: Set connection var ansible_pipelining to False 15494 1726853348.57668: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853348.57674: Set connection var ansible_shell_type to sh 15494 1726853348.57676: Set connection var ansible_timeout to 10 15494 1726853348.57687: Set connection var ansible_shell_executable to /bin/sh 15494 1726853348.57702: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.57705: variable 'ansible_connection' from source: unknown 15494 1726853348.57707: variable 'ansible_module_compression' from source: unknown 15494 1726853348.57710: variable 'ansible_shell_type' from source: unknown 15494 1726853348.57712: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.57714: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.57719: variable 'ansible_pipelining' from source: unknown 15494 1726853348.57722: variable 'ansible_timeout' from source: unknown 15494 1726853348.57724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.57870: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853348.57879: variable 'omit' from source: magic vars 15494 1726853348.57888: starting 
attempt loop 15494 1726853348.57906: running the handler 15494 1726853348.57995: variable 'lsr_net_profile_exists' from source: set_fact 15494 1726853348.57999: Evaluated conditional (lsr_net_profile_exists): True 15494 1726853348.58001: handler run complete 15494 1726853348.58011: attempt loop complete, returning result 15494 1726853348.58014: _execute() done 15494 1726853348.58018: dumping result to json 15494 1726853348.58021: done dumping result, returning 15494 1726853348.58030: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'LSR-TST-br31' [02083763-bbaf-0028-1a50-000000000260] 15494 1726853348.58033: sending task result for task 02083763-bbaf-0028-1a50-000000000260 15494 1726853348.58152: done sending task result for task 02083763-bbaf-0028-1a50-000000000260 15494 1726853348.58155: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853348.58215: no more pending results, returning what we have 15494 1726853348.58219: results queue empty 15494 1726853348.58220: checking for any_errors_fatal 15494 1726853348.58227: done checking for any_errors_fatal 15494 1726853348.58228: checking for max_fail_percentage 15494 1726853348.58229: done checking for max_fail_percentage 15494 1726853348.58230: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.58231: done checking to see if all hosts have failed 15494 1726853348.58232: getting the remaining hosts for this loop 15494 1726853348.58233: done getting the remaining hosts for this loop 15494 1726853348.58237: getting the next task for host managed_node1 15494 1726853348.58244: done getting next task for host managed_node1 15494 1726853348.58247: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15494 1726853348.58250: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853348.58253: getting variables 15494 1726853348.58254: in VariableManager get_vars() 15494 1726853348.58287: Calling all_inventory to load vars for managed_node1 15494 1726853348.58290: Calling groups_inventory to load vars for managed_node1 15494 1726853348.58294: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.58304: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.58306: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.58308: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.59530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.60562: done with get_vars() 15494 1726853348.60580: done getting variables 15494 1726853348.60624: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853348.60706: variable 'profile' from source: play vars 15494 1726853348.60709: variable 'interface' from source: set_fact 15494 1726853348.60747: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:29:08 -0400 (0:00:00.045) 0:00:17.223 ****** 15494 1726853348.60776: entering _queue_task() for managed_node1/assert 15494 1726853348.61015: worker is 1 (out of 1 available) 15494 1726853348.61029: exiting _queue_task() for managed_node1/assert 15494 1726853348.61042: done queuing things up, now waiting for results queue to drain 15494 1726853348.61043: waiting for pending results... 15494 1726853348.61216: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15494 1726853348.61284: in run() - task 02083763-bbaf-0028-1a50-000000000261 15494 1726853348.61295: variable 'ansible_search_path' from source: unknown 15494 1726853348.61299: variable 'ansible_search_path' from source: unknown 15494 1726853348.61326: calling self._execute() 15494 1726853348.61401: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.61405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.61414: variable 'omit' from source: magic vars 15494 1726853348.61689: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.61700: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.61705: variable 'omit' from source: magic vars 15494 1726853348.61736: variable 'omit' from source: magic vars 15494 1726853348.61806: variable 'profile' from source: play vars 15494 1726853348.61810: variable 'interface' from source: set_fact 15494 1726853348.61859: variable 'interface' from source: set_fact 15494 1726853348.61875: variable 'omit' from source: magic vars 15494 1726853348.61908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853348.61937: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853348.61956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853348.61969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.61981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.62004: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853348.62007: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.62011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.62084: Set connection var ansible_connection to ssh 15494 1726853348.62089: Set connection var ansible_pipelining to False 15494 1726853348.62095: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853348.62097: Set connection var ansible_shell_type to sh 15494 1726853348.62102: Set connection var ansible_timeout to 10 15494 1726853348.62109: Set connection var ansible_shell_executable to /bin/sh 15494 1726853348.62126: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.62129: variable 'ansible_connection' from source: unknown 15494 1726853348.62131: variable 'ansible_module_compression' from source: unknown 15494 1726853348.62133: variable 'ansible_shell_type' from source: unknown 15494 1726853348.62136: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.62138: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.62148: variable 'ansible_pipelining' from source: unknown 15494 1726853348.62150: variable 'ansible_timeout' from source: unknown 15494 1726853348.62153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 
1726853348.62252: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853348.62264: variable 'omit' from source: magic vars 15494 1726853348.62269: starting attempt loop 15494 1726853348.62274: running the handler 15494 1726853348.62345: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15494 1726853348.62352: Evaluated conditional (lsr_net_profile_ansible_managed): True 15494 1726853348.62357: handler run complete 15494 1726853348.62375: attempt loop complete, returning result 15494 1726853348.62378: _execute() done 15494 1726853348.62381: dumping result to json 15494 1726853348.62383: done dumping result, returning 15494 1726853348.62389: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [02083763-bbaf-0028-1a50-000000000261] 15494 1726853348.62393: sending task result for task 02083763-bbaf-0028-1a50-000000000261 15494 1726853348.62468: done sending task result for task 02083763-bbaf-0028-1a50-000000000261 15494 1726853348.62475: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853348.62519: no more pending results, returning what we have 15494 1726853348.62522: results queue empty 15494 1726853348.62523: checking for any_errors_fatal 15494 1726853348.62529: done checking for any_errors_fatal 15494 1726853348.62530: checking for max_fail_percentage 15494 1726853348.62531: done checking for max_fail_percentage 15494 1726853348.62532: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.62533: done checking to see if all hosts have failed 15494 1726853348.62534: getting the remaining hosts for this loop 15494 1726853348.62536: done 
getting the remaining hosts for this loop 15494 1726853348.62539: getting the next task for host managed_node1 15494 1726853348.62546: done getting next task for host managed_node1 15494 1726853348.62548: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15494 1726853348.62551: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853348.62555: getting variables 15494 1726853348.62556: in VariableManager get_vars() 15494 1726853348.62584: Calling all_inventory to load vars for managed_node1 15494 1726853348.62586: Calling groups_inventory to load vars for managed_node1 15494 1726853348.62590: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.62600: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.62602: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.62605: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.63405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.64375: done with get_vars() 15494 1726853348.64389: done getting variables 15494 1726853348.64430: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 15494 1726853348.64508: variable 'profile' from source: play vars 15494 1726853348.64511: variable 'interface' from source: set_fact 15494 1726853348.64551: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:29:08 -0400 (0:00:00.038) 0:00:17.261 ****** 15494 1726853348.64580: entering _queue_task() for managed_node1/assert 15494 1726853348.64798: worker is 1 (out of 1 available) 15494 1726853348.64811: exiting _queue_task() for managed_node1/assert 15494 1726853348.64824: done queuing things up, now waiting for results queue to drain 15494 1726853348.64826: waiting for pending results... 15494 1726853348.64995: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15494 1726853348.65063: in run() - task 02083763-bbaf-0028-1a50-000000000262 15494 1726853348.65075: variable 'ansible_search_path' from source: unknown 15494 1726853348.65079: variable 'ansible_search_path' from source: unknown 15494 1726853348.65107: calling self._execute() 15494 1726853348.65180: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.65184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.65187: variable 'omit' from source: magic vars 15494 1726853348.65447: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.65458: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.65465: variable 'omit' from source: magic vars 15494 1726853348.65496: variable 'omit' from source: magic vars 15494 1726853348.65567: variable 'profile' from source: play vars 15494 1726853348.65573: variable 'interface' from source: set_fact 15494 
1726853348.65617: variable 'interface' from source: set_fact 15494 1726853348.65633: variable 'omit' from source: magic vars 15494 1726853348.65666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853348.65693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853348.65709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853348.65723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.65735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.65760: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853348.65763: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.65766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.65836: Set connection var ansible_connection to ssh 15494 1726853348.65840: Set connection var ansible_pipelining to False 15494 1726853348.65847: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853348.65852: Set connection var ansible_shell_type to sh 15494 1726853348.65857: Set connection var ansible_timeout to 10 15494 1726853348.65864: Set connection var ansible_shell_executable to /bin/sh 15494 1726853348.65883: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.65885: variable 'ansible_connection' from source: unknown 15494 1726853348.65888: variable 'ansible_module_compression' from source: unknown 15494 1726853348.65890: variable 'ansible_shell_type' from source: unknown 15494 1726853348.65892: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.65894: variable 'ansible_host' from source: host 
vars for 'managed_node1' 15494 1726853348.65898: variable 'ansible_pipelining' from source: unknown 15494 1726853348.65900: variable 'ansible_timeout' from source: unknown 15494 1726853348.65905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.66006: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853348.66015: variable 'omit' from source: magic vars 15494 1726853348.66022: starting attempt loop 15494 1726853348.66025: running the handler 15494 1726853348.66100: variable 'lsr_net_profile_fingerprint' from source: set_fact 15494 1726853348.66104: Evaluated conditional (lsr_net_profile_fingerprint): True 15494 1726853348.66109: handler run complete 15494 1726853348.66120: attempt loop complete, returning result 15494 1726853348.66123: _execute() done 15494 1726853348.66125: dumping result to json 15494 1726853348.66128: done dumping result, returning 15494 1726853348.66135: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000262] 15494 1726853348.66138: sending task result for task 02083763-bbaf-0028-1a50-000000000262 15494 1726853348.66216: done sending task result for task 02083763-bbaf-0028-1a50-000000000262 15494 1726853348.66219: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853348.66293: no more pending results, returning what we have 15494 1726853348.66296: results queue empty 15494 1726853348.66297: checking for any_errors_fatal 15494 1726853348.66302: done checking for any_errors_fatal 15494 1726853348.66303: checking for max_fail_percentage 15494 1726853348.66305: done checking for 
max_fail_percentage 15494 1726853348.66306: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.66307: done checking to see if all hosts have failed 15494 1726853348.66307: getting the remaining hosts for this loop 15494 1726853348.66309: done getting the remaining hosts for this loop 15494 1726853348.66312: getting the next task for host managed_node1 15494 1726853348.66320: done getting next task for host managed_node1 15494 1726853348.66322: ^ task is: TASK: meta (flush_handlers) 15494 1726853348.66324: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853348.66329: getting variables 15494 1726853348.66330: in VariableManager get_vars() 15494 1726853348.66352: Calling all_inventory to load vars for managed_node1 15494 1726853348.66355: Calling groups_inventory to load vars for managed_node1 15494 1726853348.66358: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.66366: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.66368: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.66372: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.67250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.68427: done with get_vars() 15494 1726853348.68452: done getting variables 15494 1726853348.68533: in VariableManager get_vars() 15494 1726853348.68546: Calling all_inventory to load vars for managed_node1 15494 1726853348.68548: Calling groups_inventory to load vars for managed_node1 15494 1726853348.68552: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.68559: Calling 
all_plugins_play to load vars for managed_node1 15494 1726853348.68562: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.68565: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.69556: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.71027: done with get_vars() 15494 1726853348.71055: done queuing things up, now waiting for results queue to drain 15494 1726853348.71057: results queue empty 15494 1726853348.71058: checking for any_errors_fatal 15494 1726853348.71060: done checking for any_errors_fatal 15494 1726853348.71061: checking for max_fail_percentage 15494 1726853348.71062: done checking for max_fail_percentage 15494 1726853348.71068: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.71069: done checking to see if all hosts have failed 15494 1726853348.71069: getting the remaining hosts for this loop 15494 1726853348.71070: done getting the remaining hosts for this loop 15494 1726853348.71075: getting the next task for host managed_node1 15494 1726853348.71079: done getting next task for host managed_node1 15494 1726853348.71080: ^ task is: TASK: meta (flush_handlers) 15494 1726853348.71082: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.71084: getting variables 15494 1726853348.71085: in VariableManager get_vars() 15494 1726853348.71093: Calling all_inventory to load vars for managed_node1 15494 1726853348.71095: Calling groups_inventory to load vars for managed_node1 15494 1726853348.71097: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.71102: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.71104: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.71107: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.72251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.73788: done with get_vars() 15494 1726853348.73810: done getting variables 15494 1726853348.73861: in VariableManager get_vars() 15494 1726853348.73870: Calling all_inventory to load vars for managed_node1 15494 1726853348.73874: Calling groups_inventory to load vars for managed_node1 15494 1726853348.73876: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.73880: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.73882: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.73885: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.74718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.75595: done with get_vars() 15494 1726853348.75612: done queuing things up, now waiting for results queue to drain 15494 1726853348.75614: results queue empty 15494 1726853348.75614: checking for any_errors_fatal 15494 1726853348.75616: done checking for any_errors_fatal 15494 1726853348.75616: checking for max_fail_percentage 15494 1726853348.75617: done checking for max_fail_percentage 15494 1726853348.75617: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853348.75618: done checking to see if all hosts have failed 15494 1726853348.75618: getting the remaining hosts for this loop 15494 1726853348.75619: done getting the remaining hosts for this loop 15494 1726853348.75621: getting the next task for host managed_node1 15494 1726853348.75623: done getting next task for host managed_node1 15494 1726853348.75623: ^ task is: None 15494 1726853348.75624: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853348.75625: done queuing things up, now waiting for results queue to drain 15494 1726853348.75625: results queue empty 15494 1726853348.75626: checking for any_errors_fatal 15494 1726853348.75626: done checking for any_errors_fatal 15494 1726853348.75627: checking for max_fail_percentage 15494 1726853348.75627: done checking for max_fail_percentage 15494 1726853348.75628: checking to see if all hosts have failed and the running result is not ok 15494 1726853348.75628: done checking to see if all hosts have failed 15494 1726853348.75629: getting the next task for host managed_node1 15494 1726853348.75630: done getting next task for host managed_node1 15494 1726853348.75630: ^ task is: None 15494 1726853348.75631: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.75668: in VariableManager get_vars() 15494 1726853348.75685: done with get_vars() 15494 1726853348.75689: in VariableManager get_vars() 15494 1726853348.75697: done with get_vars() 15494 1726853348.75701: variable 'omit' from source: magic vars 15494 1726853348.75784: variable 'profile' from source: play vars 15494 1726853348.75867: in VariableManager get_vars() 15494 1726853348.75879: done with get_vars() 15494 1726853348.75893: variable 'omit' from source: magic vars 15494 1726853348.75937: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15494 1726853348.76568: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853348.76593: getting the remaining hosts for this loop 15494 1726853348.76594: done getting the remaining hosts for this loop 15494 1726853348.76597: getting the next task for host managed_node1 15494 1726853348.76599: done getting next task for host managed_node1 15494 1726853348.76601: ^ task is: TASK: Gathering Facts 15494 1726853348.76603: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853348.76605: getting variables 15494 1726853348.76605: in VariableManager get_vars() 15494 1726853348.76615: Calling all_inventory to load vars for managed_node1 15494 1726853348.76618: Calling groups_inventory to load vars for managed_node1 15494 1726853348.76620: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853348.76625: Calling all_plugins_play to load vars for managed_node1 15494 1726853348.76628: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853348.76630: Calling groups_plugins_play to load vars for managed_node1 15494 1726853348.77816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853348.79422: done with get_vars() 15494 1726853348.79443: done getting variables 15494 1726853348.79493: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 13:29:08 -0400 (0:00:00.149) 0:00:17.410 ****** 15494 1726853348.79518: entering _queue_task() for managed_node1/gather_facts 15494 1726853348.79856: worker is 1 (out of 1 available) 15494 1726853348.79867: exiting _queue_task() for managed_node1/gather_facts 15494 1726853348.79881: done queuing things up, now waiting for results queue to drain 15494 1726853348.79883: waiting for pending results... 
15494 1726853348.80208: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853348.80257: in run() - task 02083763-bbaf-0028-1a50-0000000002b5 15494 1726853348.80280: variable 'ansible_search_path' from source: unknown 15494 1726853348.80324: calling self._execute() 15494 1726853348.80422: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.80433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.80446: variable 'omit' from source: magic vars 15494 1726853348.80855: variable 'ansible_distribution_major_version' from source: facts 15494 1726853348.80873: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853348.80955: variable 'omit' from source: magic vars 15494 1726853348.80958: variable 'omit' from source: magic vars 15494 1726853348.80961: variable 'omit' from source: magic vars 15494 1726853348.80992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853348.81030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853348.81060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853348.81086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.81101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853348.81133: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853348.81144: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.81173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.81260: Set connection var ansible_connection to ssh 15494 1726853348.81277: Set 
connection var ansible_pipelining to False 15494 1726853348.81476: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853348.81480: Set connection var ansible_shell_type to sh 15494 1726853348.81483: Set connection var ansible_timeout to 10 15494 1726853348.81486: Set connection var ansible_shell_executable to /bin/sh 15494 1726853348.81489: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.81492: variable 'ansible_connection' from source: unknown 15494 1726853348.81495: variable 'ansible_module_compression' from source: unknown 15494 1726853348.81498: variable 'ansible_shell_type' from source: unknown 15494 1726853348.81501: variable 'ansible_shell_executable' from source: unknown 15494 1726853348.81504: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853348.81506: variable 'ansible_pipelining' from source: unknown 15494 1726853348.81509: variable 'ansible_timeout' from source: unknown 15494 1726853348.81512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853348.81582: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853348.81597: variable 'omit' from source: magic vars 15494 1726853348.81606: starting attempt loop 15494 1726853348.81612: running the handler 15494 1726853348.81639: variable 'ansible_facts' from source: unknown 15494 1726853348.81662: _low_level_execute_command(): starting 15494 1726853348.81675: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853348.82513: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853348.82519: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.82757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.84474: stdout chunk (state=3): >>>/root <<< 15494 1726853348.84604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.84622: stdout chunk (state=3): >>><<< 15494 1726853348.84636: stderr chunk (state=3): >>><<< 15494 1726853348.84664: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853348.84761: _low_level_execute_command(): starting 15494 1726853348.84766: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971 `" && echo ansible-tmp-1726853348.8466985-16308-77133914360971="` echo /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971 `" ) && sleep 0' 15494 1726853348.85376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853348.85393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.85413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853348.85434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853348.85525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.85558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853348.85580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853348.85602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.85680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.87634: stdout chunk (state=3): >>>ansible-tmp-1726853348.8466985-16308-77133914360971=/root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971 <<< 15494 1726853348.87782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.87792: stdout chunk (state=3): >>><<< 15494 1726853348.87807: stderr chunk (state=3): >>><<< 15494 1726853348.87976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853348.8466985-16308-77133914360971=/root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853348.87980: variable 'ansible_module_compression' from source: unknown 15494 1726853348.87982: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853348.87994: variable 'ansible_facts' from source: unknown 15494 1726853348.88222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py 15494 1726853348.88450: Sending initial data 15494 1726853348.88453: Sent initial data (153 bytes) 15494 1726853348.89040: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853348.89058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.89078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853348.89189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853348.89205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853348.89224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853348.89238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.89316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.90915: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853348.90940: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853348.91007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853348.91066: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpgn9_5d3n /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py <<< 15494 1726853348.91070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py" <<< 15494 1726853348.91105: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpgn9_5d3n" to remote "/root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py" <<< 15494 1726853348.92540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.92587: stderr chunk (state=3): >>><<< 15494 1726853348.92632: stdout chunk (state=3): >>><<< 15494 1726853348.92635: done transferring module to remote 15494 1726853348.92645: _low_level_execute_command(): starting 15494 1726853348.92657: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/ /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py && sleep 0' 15494 1726853348.93313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853348.93330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.93386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.93449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853348.93472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.93550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853348.95464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853348.95467: stdout chunk (state=3): >>><<< 15494 1726853348.95469: stderr chunk (state=3): >>><<< 15494 1726853348.95488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853348.95496: _low_level_execute_command(): starting 15494 1726853348.95576: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/AnsiballZ_setup.py && sleep 0' 15494 1726853348.96105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853348.96119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853348.96133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853348.96153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853348.96170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853348.96184: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853348.96197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853348.96214: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853348.96292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15494 1726853348.96318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853348.96333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853348.96359: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853348.96440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853349.61590: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", 
"ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.6650390625, "5m": 0.3740234375, "15m": 0.162109375}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "09", "epoch": "1726853349", "epoch_int": "1726853349", "date": "2024-09-20", "time": "13:29:09", "iso8601_micro": "2024-09-20T17:29:09.247297Z", "iso8601": "2024-09-20T17:29:09Z", "iso8601_basic": "20240920T132909247<<< 15494 1726853349.61630: stdout chunk (state=3): >>>297", "iso8601_basic_short": "20240920T132909", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", 
"host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 515, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797081088, "block_size": 4096, "block_total": 65519099, "block_available": 63915303, "block_used": 1603796, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1e:33:24:48:88:76", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853349.63631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853349.63729: stderr chunk (state=3): >>><<< 15494 1726853349.63732: stdout chunk (state=3): >>><<< 15494 1726853349.63735: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.6650390625, "5m": 0.3740234375, "15m": 0.162109375}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": 
"5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "09", "epoch": "1726853349", "epoch_int": "1726853349", "date": "2024-09-20", "time": "13:29:09", "iso8601_micro": "2024-09-20T17:29:09.247297Z", "iso8601": "2024-09-20T17:29:09Z", "iso8601_basic": "20240920T132909247297", "iso8601_basic_short": "20240920T132909", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2951, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": 
{"total": 3531, "used": 580, "free": 2951}, "nocache": {"free": 3288, "used": 243}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 515, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", 
"dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797081088, "block_size": 4096, "block_total": 65519099, "block_available": 63915303, "block_used": 1603796, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "1e:33:24:48:88:76", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853349.64154: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853349.64193: _low_level_execute_command(): starting 15494 1726853349.64197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853348.8466985-16308-77133914360971/ > /dev/null 2>&1 && sleep 0' 15494 1726853349.64702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853349.64705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853349.64707: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853349.64709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853349.64712: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853349.64767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853349.64774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853349.64817: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853349.66624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853349.66653: stderr chunk (state=3): >>><<< 15494 1726853349.66656: stdout chunk (state=3): >>><<< 15494 1726853349.66670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15494 1726853349.66678: handler run complete 15494 1726853349.66786: variable 'ansible_facts' from source: unknown 15494 1726853349.66847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853349.67041: variable 'ansible_facts' from source: unknown 15494 1726853349.67100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853349.67180: attempt loop complete, returning result 15494 1726853349.67183: _execute() done 15494 1726853349.67186: dumping result to json 15494 1726853349.67209: done dumping result, returning 15494 1726853349.67217: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-0000000002b5] 15494 1726853349.67222: sending task result for task 02083763-bbaf-0028-1a50-0000000002b5 ok: [managed_node1] 15494 1726853349.67802: no more pending results, returning what we have 15494 1726853349.67804: results queue empty 15494 1726853349.67805: checking for any_errors_fatal 15494 1726853349.67806: done checking for any_errors_fatal 15494 1726853349.67806: checking for max_fail_percentage 15494 1726853349.67807: done checking for max_fail_percentage 15494 1726853349.67808: checking to see if all hosts have failed and the running result is not ok 15494 1726853349.67808: done checking to see if all hosts have failed 15494 1726853349.67808: getting the remaining hosts for this loop 15494 1726853349.67809: done getting the remaining hosts for this loop 15494 1726853349.67812: getting the next task for host managed_node1 15494 1726853349.67815: done getting next task for host managed_node1 15494 1726853349.67816: ^ task is: TASK: meta (flush_handlers) 15494 1726853349.67817: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853349.67820: getting variables 15494 1726853349.67821: in VariableManager get_vars() 15494 1726853349.67842: Calling all_inventory to load vars for managed_node1 15494 1726853349.67843: Calling groups_inventory to load vars for managed_node1 15494 1726853349.67845: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853349.67854: Calling all_plugins_play to load vars for managed_node1 15494 1726853349.67855: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853349.67858: Calling groups_plugins_play to load vars for managed_node1 15494 1726853349.68381: done sending task result for task 02083763-bbaf-0028-1a50-0000000002b5 15494 1726853349.68386: WORKER PROCESS EXITING 15494 1726853349.69049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853349.70325: done with get_vars() 15494 1726853349.70352: done getting variables 15494 1726853349.70426: in VariableManager get_vars() 15494 1726853349.70436: Calling all_inventory to load vars for managed_node1 15494 1726853349.70437: Calling groups_inventory to load vars for managed_node1 15494 1726853349.70439: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853349.70442: Calling all_plugins_play to load vars for managed_node1 15494 1726853349.70443: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853349.70445: Calling groups_plugins_play to load vars for managed_node1 15494 1726853349.74528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853349.75678: done with get_vars() 15494 1726853349.75707: done queuing things up, now waiting for results queue to drain 15494 1726853349.75710: results queue empty 15494 1726853349.75710: checking for 
any_errors_fatal 15494 1726853349.75714: done checking for any_errors_fatal 15494 1726853349.75715: checking for max_fail_percentage 15494 1726853349.75721: done checking for max_fail_percentage 15494 1726853349.75722: checking to see if all hosts have failed and the running result is not ok 15494 1726853349.75723: done checking to see if all hosts have failed 15494 1726853349.75724: getting the remaining hosts for this loop 15494 1726853349.75725: done getting the remaining hosts for this loop 15494 1726853349.75727: getting the next task for host managed_node1 15494 1726853349.75731: done getting next task for host managed_node1 15494 1726853349.75733: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15494 1726853349.75734: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853349.75741: getting variables 15494 1726853349.75742: in VariableManager get_vars() 15494 1726853349.75754: Calling all_inventory to load vars for managed_node1 15494 1726853349.75755: Calling groups_inventory to load vars for managed_node1 15494 1726853349.75756: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853349.75760: Calling all_plugins_play to load vars for managed_node1 15494 1726853349.75761: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853349.75763: Calling groups_plugins_play to load vars for managed_node1 15494 1726853349.76670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853349.77902: done with get_vars() 15494 1726853349.77915: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:29:09 -0400 (0:00:00.984) 0:00:18.395 ****** 15494 1726853349.77969: entering _queue_task() for managed_node1/include_tasks 15494 1726853349.78383: worker is 1 (out of 1 available) 15494 1726853349.78397: exiting _queue_task() for managed_node1/include_tasks 15494 1726853349.78409: done queuing things up, now waiting for results queue to drain 15494 1726853349.78410: waiting for pending results... 
15494 1726853349.78689: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15494 1726853349.78824: in run() - task 02083763-bbaf-0028-1a50-00000000003a 15494 1726853349.78830: variable 'ansible_search_path' from source: unknown 15494 1726853349.78833: variable 'ansible_search_path' from source: unknown 15494 1726853349.78897: calling self._execute() 15494 1726853349.78955: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853349.78959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853349.79030: variable 'omit' from source: magic vars 15494 1726853349.79278: variable 'ansible_distribution_major_version' from source: facts 15494 1726853349.79288: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853349.79295: _execute() done 15494 1726853349.79298: dumping result to json 15494 1726853349.79301: done dumping result, returning 15494 1726853349.79308: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-0028-1a50-00000000003a] 15494 1726853349.79313: sending task result for task 02083763-bbaf-0028-1a50-00000000003a 15494 1726853349.79405: done sending task result for task 02083763-bbaf-0028-1a50-00000000003a 15494 1726853349.79408: WORKER PROCESS EXITING 15494 1726853349.79460: no more pending results, returning what we have 15494 1726853349.79465: in VariableManager get_vars() 15494 1726853349.79507: Calling all_inventory to load vars for managed_node1 15494 1726853349.79509: Calling groups_inventory to load vars for managed_node1 15494 1726853349.79512: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853349.79522: Calling all_plugins_play to load vars for managed_node1 15494 1726853349.79525: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853349.79527: Calling 
groups_plugins_play to load vars for managed_node1 15494 1726853349.80581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853349.81709: done with get_vars() 15494 1726853349.81722: variable 'ansible_search_path' from source: unknown 15494 1726853349.81723: variable 'ansible_search_path' from source: unknown 15494 1726853349.81744: we have included files to process 15494 1726853349.81745: generating all_blocks data 15494 1726853349.81748: done generating all_blocks data 15494 1726853349.81749: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853349.81749: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853349.81751: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853349.82209: done processing included file 15494 1726853349.82211: iterating over new_blocks loaded from include file 15494 1726853349.82212: in VariableManager get_vars() 15494 1726853349.82232: done with get_vars() 15494 1726853349.82234: filtering new block on tags 15494 1726853349.82248: done filtering new block on tags 15494 1726853349.82250: in VariableManager get_vars() 15494 1726853349.82263: done with get_vars() 15494 1726853349.82264: filtering new block on tags 15494 1726853349.82277: done filtering new block on tags 15494 1726853349.82279: in VariableManager get_vars() 15494 1726853349.82293: done with get_vars() 15494 1726853349.82294: filtering new block on tags 15494 1726853349.82302: done filtering new block on tags 15494 1726853349.82304: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15494 1726853349.82307: extending task lists for 
all hosts with included blocks 15494 1726853349.82654: done extending task lists 15494 1726853349.82655: done processing included files 15494 1726853349.82656: results queue empty 15494 1726853349.82657: checking for any_errors_fatal 15494 1726853349.82661: done checking for any_errors_fatal 15494 1726853349.82662: checking for max_fail_percentage 15494 1726853349.82663: done checking for max_fail_percentage 15494 1726853349.82664: checking to see if all hosts have failed and the running result is not ok 15494 1726853349.82665: done checking to see if all hosts have failed 15494 1726853349.82666: getting the remaining hosts for this loop 15494 1726853349.82667: done getting the remaining hosts for this loop 15494 1726853349.82669: getting the next task for host managed_node1 15494 1726853349.82675: done getting next task for host managed_node1 15494 1726853349.82677: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15494 1726853349.82679: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
15494 1726853349.82688: getting variables
15494 1726853349.82688: in VariableManager get_vars()
15494 1726853349.82697: Calling all_inventory to load vars for managed_node1
15494 1726853349.82699: Calling groups_inventory to load vars for managed_node1
15494 1726853349.82700: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853349.82705: Calling all_plugins_play to load vars for managed_node1
15494 1726853349.82708: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853349.82715: Calling groups_plugins_play to load vars for managed_node1
15494 1726853349.83494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853349.84505: done with get_vars()
15494 1726853349.84517: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 13:29:09 -0400 (0:00:00.065) 0:00:18.461 ******
15494 1726853349.84566: entering _queue_task() for managed_node1/setup
15494 1726853349.84819: worker is 1 (out of 1 available)
15494 1726853349.84830: exiting _queue_task() for managed_node1/setup
15494 1726853349.84843: done queuing things up, now waiting for results queue to drain
15494 1726853349.84843: waiting for pending results...
15494 1726853349.85018: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
15494 1726853349.85103: in run() - task 02083763-bbaf-0028-1a50-0000000002f6
15494 1726853349.85129: variable 'ansible_search_path' from source: unknown
15494 1726853349.85132: variable 'ansible_search_path' from source: unknown
15494 1726853349.85177: calling self._execute()
15494 1726853349.85251: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853349.85256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853349.85265: variable 'omit' from source: magic vars
15494 1726853349.85596: variable 'ansible_distribution_major_version' from source: facts
15494 1726853349.85600: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853349.85755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15494 1726853349.87479: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15494 1726853349.87536: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15494 1726853349.87564: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15494 1726853349.87590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15494 1726853349.87611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15494 1726853349.87668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853349.87690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853349.87712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853349.87737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853349.87750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853349.87804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853349.87824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853349.87861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853349.87884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853349.87942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853349.88119: variable '__network_required_facts' from source: role '' defaults
15494 1726853349.88125: variable 'ansible_facts' from source: unknown
15494 1726853349.88673: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
15494 1726853349.88690: when evaluation is False, skipping this task
15494 1726853349.88693: _execute() done
15494 1726853349.88696: dumping result to json
15494 1726853349.88699: done dumping result, returning
15494 1726853349.88701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-0028-1a50-0000000002f6]
15494 1726853349.88704: sending task result for task 02083763-bbaf-0028-1a50-0000000002f6
15494 1726853349.88803: done sending task result for task 02083763-bbaf-0028-1a50-0000000002f6
15494 1726853349.88810: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15494 1726853349.88887: no more pending results, returning what we have
15494 1726853349.88891: results queue empty
15494 1726853349.88891: checking for any_errors_fatal
15494 1726853349.88893: done checking for any_errors_fatal
15494 1726853349.88893: checking for max_fail_percentage
15494 1726853349.88895: done checking for max_fail_percentage
15494 1726853349.88895: checking to see if all hosts have failed and the running result is not ok
15494 1726853349.88896: done checking to see if all hosts have failed
15494 1726853349.88897: getting the remaining hosts for this loop
15494 1726853349.88898: done getting the remaining hosts for this loop
15494 1726853349.88901: getting the next task for host managed_node1
15494 1726853349.88909: done getting next task for host managed_node1
15494 1726853349.88912: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
15494 1726853349.88915: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853349.88928: getting variables
15494 1726853349.88931: in VariableManager get_vars()
15494 1726853349.88962: Calling all_inventory to load vars for managed_node1
15494 1726853349.88965: Calling groups_inventory to load vars for managed_node1
15494 1726853349.88966: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853349.88977: Calling all_plugins_play to load vars for managed_node1
15494 1726853349.88979: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853349.88982: Calling groups_plugins_play to load vars for managed_node1
15494 1726853349.89959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853349.90934: done with get_vars()
15494 1726853349.90951: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 13:29:09 -0400 (0:00:00.064) 0:00:18.525 ******
15494 1726853349.91017: entering _queue_task() for managed_node1/stat
15494 1726853349.91243: worker is 1 (out of 1 available)
15494 1726853349.91258: exiting _queue_task() for managed_node1/stat
15494 1726853349.91269: done queuing things up, now waiting for results queue to drain
15494 1726853349.91270: waiting for pending results...
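The skip above comes from the guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the setup task only runs when some fact the role needs is not yet in `ansible_facts`. A minimal Python sketch of that set logic (the fact names below are illustrative, not the role's actual list):

```python
# Python equivalent of Jinja2's `difference` filter applied to fact keys.
# The fact names here are hypothetical examples, not the role's real list.
required_facts = ["ansible_distribution", "ansible_distribution_major_version"]

gathered = {
    "ansible_distribution": "CentOS",
    "ansible_distribution_major_version": "10",
    "ansible_os_family": "RedHat",
}

# difference(required, gathered keys): facts still missing on the host
missing = [f for f in required_facts if f not in gathered]

# The task runs only when the conditional is True, i.e. something is missing.
run_setup = len(missing) > 0
print(run_setup)
```

With every required fact already gathered, `missing` is empty, the conditional is False, and the task is skipped exactly as in the log.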
15494 1726853349.91455: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
15494 1726853349.91552: in run() - task 02083763-bbaf-0028-1a50-0000000002f8
15494 1726853349.91561: variable 'ansible_search_path' from source: unknown
15494 1726853349.91565: variable 'ansible_search_path' from source: unknown
15494 1726853349.91595: calling self._execute()
15494 1726853349.91664: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853349.91668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853349.91679: variable 'omit' from source: magic vars
15494 1726853349.92006: variable 'ansible_distribution_major_version' from source: facts
15494 1726853349.92009: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853349.92293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15494 1726853349.92473: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15494 1726853349.92509: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15494 1726853349.92533: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15494 1726853349.92558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15494 1726853349.92775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15494 1726853349.92855: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15494 1726853349.92876: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853349.92887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15494 1726853349.93017: variable '__network_is_ostree' from source: set_fact
15494 1726853349.93021: Evaluated conditional (not __network_is_ostree is defined): False
15494 1726853349.93024: when evaluation is False, skipping this task
15494 1726853349.93031: _execute() done
15494 1726853349.93034: dumping result to json
15494 1726853349.93039: done dumping result, returning
15494 1726853349.93042: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-0028-1a50-0000000002f8]
15494 1726853349.93048: sending task result for task 02083763-bbaf-0028-1a50-0000000002f8
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15494 1726853349.93344: no more pending results, returning what we have
15494 1726853349.93349: results queue empty
15494 1726853349.93350: checking for any_errors_fatal
15494 1726853349.93355: done checking for any_errors_fatal
15494 1726853349.93356: checking for max_fail_percentage
15494 1726853349.93357: done checking for max_fail_percentage
15494 1726853349.93357: checking to see if all hosts have failed and the running result is not ok
15494 1726853349.93358: done checking to see if all hosts have failed
15494 1726853349.93359: getting the remaining hosts for this loop
15494 1726853349.93360: done getting the remaining hosts for this loop
15494 1726853349.93364: getting the next task for host managed_node1
15494 1726853349.93372: done getting next task for host managed_node1
15494 1726853349.93376: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
15494 1726853349.93378: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853349.93391: getting variables
15494 1726853349.93392: in VariableManager get_vars()
15494 1726853349.93424: Calling all_inventory to load vars for managed_node1
15494 1726853349.93427: Calling groups_inventory to load vars for managed_node1
15494 1726853349.93429: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853349.93439: Calling all_plugins_play to load vars for managed_node1
15494 1726853349.93443: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853349.93449: Calling groups_plugins_play to load vars for managed_node1
15494 1726853349.93459: done sending task result for task 02083763-bbaf-0028-1a50-0000000002f8
15494 1726853349.93462: WORKER PROCESS EXITING
15494 1726853349.94746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853349.96336: done with get_vars()
15494 1726853349.96360: done getting variables
15494 1726853349.96420: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
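Both ostree-related tasks in the log (the stat probe and the set_fact that follows) are guarded by `when: not __network_is_ostree is defined`, so the remote check runs at most once per host: as soon as the fact exists, the guard is False regardless of the fact's value. A sketch of that test, modeling host facts as a plain dict:

```python
# Jinja2's `is defined` test, modeled with a dict of host facts.
# `__network_is_ostree` was already set by an earlier set_fact in this run,
# so `not __network_is_ostree is defined` evaluates to False here.
facts = {"__network_is_ostree": False}  # the value itself is irrelevant to the guard

def should_probe(host_facts):
    """Run the ostree check only while the flag is still undefined."""
    return "__network_is_ostree" not in host_facts

print(should_probe(facts))
```

On a fresh host with no such fact, `should_probe` returns True and the stat task actually executes; here it returns False, matching `"skip_reason": "Conditional result was False"`.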
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 13:29:09 -0400 (0:00:00.054) 0:00:18.580 ******
15494 1726853349.96456: entering _queue_task() for managed_node1/set_fact
15494 1726853349.96774: worker is 1 (out of 1 available)
15494 1726853349.96786: exiting _queue_task() for managed_node1/set_fact
15494 1726853349.96799: done queuing things up, now waiting for results queue to drain
15494 1726853349.96800: waiting for pending results...
15494 1726853349.97191: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
15494 1726853349.97235: in run() - task 02083763-bbaf-0028-1a50-0000000002f9
15494 1726853349.97260: variable 'ansible_search_path' from source: unknown
15494 1726853349.97269: variable 'ansible_search_path' from source: unknown
15494 1726853349.97315: calling self._execute()
15494 1726853349.97433: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853349.97444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853349.97462: variable 'omit' from source: magic vars
15494 1726853349.97850: variable 'ansible_distribution_major_version' from source: facts
15494 1726853349.97868: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853349.98033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15494 1726853349.98313: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15494 1726853349.98378: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15494 1726853349.98416: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15494 1726853349.98454: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15494 1726853349.98603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15494 1726853349.98633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15494 1726853349.98664: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853349.98700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15494 1726853349.98790: variable '__network_is_ostree' from source: set_fact
15494 1726853349.98806: Evaluated conditional (not __network_is_ostree is defined): False
15494 1726853349.98909: when evaluation is False, skipping this task
15494 1726853349.98912: _execute() done
15494 1726853349.98916: dumping result to json
15494 1726853349.98918: done dumping result, returning
15494 1726853349.98921: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-0028-1a50-0000000002f9]
15494 1726853349.98923: sending task result for task 02083763-bbaf-0028-1a50-0000000002f9
15494 1726853349.98990: done sending task result for task 02083763-bbaf-0028-1a50-0000000002f9
15494 1726853349.98993: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15494 1726853349.99062: no more pending results, returning what we have
15494 1726853349.99066: results queue empty
15494 1726853349.99067: checking for any_errors_fatal
15494 1726853349.99078: done checking for any_errors_fatal
15494 1726853349.99079: checking for max_fail_percentage
15494 1726853349.99081: done checking for max_fail_percentage
15494 1726853349.99082: checking to see if all hosts have failed and the running result is not ok
15494 1726853349.99083: done checking to see if all hosts have failed
15494 1726853349.99084: getting the remaining hosts for this loop
15494 1726853349.99086: done getting the remaining hosts for this loop
15494 1726853349.99091: getting the next task for host managed_node1
15494 1726853349.99100: done getting next task for host managed_node1
15494 1726853349.99105: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
15494 1726853349.99108: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853349.99122: getting variables
15494 1726853349.99124: in VariableManager get_vars()
15494 1726853349.99164: Calling all_inventory to load vars for managed_node1
15494 1726853349.99167: Calling groups_inventory to load vars for managed_node1
15494 1726853349.99169: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853349.99287: Calling all_plugins_play to load vars for managed_node1
15494 1726853349.99291: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853349.99294: Calling groups_plugins_play to load vars for managed_node1
15494 1726853350.00807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853350.02438: done with get_vars()
15494 1726853350.02474: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 13:29:10 -0400 (0:00:00.061) 0:00:18.641 ******
15494 1726853350.02577: entering _queue_task() for managed_node1/service_facts
15494 1726853350.02933: worker is 1 (out of 1 available)
15494 1726853350.02945: exiting _queue_task() for managed_node1/service_facts
15494 1726853350.02960: done queuing things up, now waiting for results queue to drain
15494 1726853350.02961: waiting for pending results...
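The first `_low_level_execute_command()` call in the service_facts task below runs `/bin/sh -c 'echo ~ && sleep 0'`: a cheap probe whose only job is to let the remote shell expand `~`, so Ansible learns the remote home directory (here `/root`) before building its temp paths. A local sketch of the same probe; over SSH the identical command string is executed on the managed node instead:

```python
import subprocess

# Reproduce Ansible's home-directory probe locally. The `&& sleep 0`
# matches the log verbatim; it just forces a clean exit status.
proc = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True, text=True, check=True,
)
home = proc.stdout.strip()  # the expanded home directory path
print(home)
```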
15494 1726853350.03243: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15494 1726853350.03579: in run() - task 02083763-bbaf-0028-1a50-0000000002fb 15494 1726853350.03582: variable 'ansible_search_path' from source: unknown 15494 1726853350.03586: variable 'ansible_search_path' from source: unknown 15494 1726853350.03589: calling self._execute() 15494 1726853350.03592: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853350.03595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853350.03598: variable 'omit' from source: magic vars 15494 1726853350.03968: variable 'ansible_distribution_major_version' from source: facts 15494 1726853350.03992: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853350.04003: variable 'omit' from source: magic vars 15494 1726853350.04070: variable 'omit' from source: magic vars 15494 1726853350.04114: variable 'omit' from source: magic vars 15494 1726853350.04164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853350.04206: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853350.04238: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853350.04263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853350.04282: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853350.04317: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853350.04326: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853350.04337: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15494 1726853350.04442: Set connection var ansible_connection to ssh 15494 1726853350.04461: Set connection var ansible_pipelining to False 15494 1726853350.04474: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853350.04560: Set connection var ansible_shell_type to sh 15494 1726853350.04563: Set connection var ansible_timeout to 10 15494 1726853350.04566: Set connection var ansible_shell_executable to /bin/sh 15494 1726853350.04569: variable 'ansible_shell_executable' from source: unknown 15494 1726853350.04573: variable 'ansible_connection' from source: unknown 15494 1726853350.04576: variable 'ansible_module_compression' from source: unknown 15494 1726853350.04577: variable 'ansible_shell_type' from source: unknown 15494 1726853350.04580: variable 'ansible_shell_executable' from source: unknown 15494 1726853350.04581: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853350.04583: variable 'ansible_pipelining' from source: unknown 15494 1726853350.04585: variable 'ansible_timeout' from source: unknown 15494 1726853350.04587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853350.04765: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853350.04787: variable 'omit' from source: magic vars 15494 1726853350.04798: starting attempt loop 15494 1726853350.04806: running the handler 15494 1726853350.04824: _low_level_execute_command(): starting 15494 1726853350.04837: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853350.05591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853350.05607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15494 1726853350.05622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853350.05649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853350.05756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853350.05789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853350.05867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853350.07627: stdout chunk (state=3): >>>/root <<< 15494 1726853350.07904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853350.07912: stdout chunk (state=3): >>><<< 15494 1726853350.07918: stderr chunk (state=3): >>><<< 15494 1726853350.08174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853350.08177: _low_level_execute_command(): starting 15494 1726853350.08184: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596 `" && echo ansible-tmp-1726853350.0801451-16345-258463046754596="` echo /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596 `" ) && sleep 0' 15494 1726853350.08847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853350.08960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853350.09007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853350.09138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853350.09284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853350.11075: stdout chunk (state=3): >>>ansible-tmp-1726853350.0801451-16345-258463046754596=/root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596 <<< 15494 1726853350.11190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853350.11240: stderr chunk (state=3): >>><<< 15494 1726853350.11249: stdout chunk (state=3): >>><<< 15494 1726853350.11273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853350.0801451-16345-258463046754596=/root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853350.11476: variable 'ansible_module_compression' from source: unknown 15494 1726853350.11480: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15494 1726853350.11482: variable 'ansible_facts' from source: unknown 15494 1726853350.11497: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py 15494 1726853350.11732: Sending initial data 15494 1726853350.11735: Sent initial data (162 bytes) 15494 1726853350.12304: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853350.12320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853350.12381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853350.12453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853350.12484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853350.12506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853350.12565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853350.14202: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853350.14245: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853350.14279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpj_qnqb2n /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py <<< 15494 1726853350.14303: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py" <<< 15494 1726853350.14355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpj_qnqb2n" to remote "/root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py" <<< 15494 1726853350.16077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853350.16081: stdout chunk (state=3): >>><<< 15494 1726853350.16083: stderr chunk (state=3): >>><<< 15494 1726853350.16132: done transferring module to remote 15494 1726853350.16135: _low_level_execute_command(): starting 15494 1726853350.16137: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/ /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py && sleep 0' 15494 1726853350.17304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853350.17576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853350.17592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853350.17787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853350.17854: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853350.19692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853350.19701: stdout chunk (state=3): >>><<< 15494 1726853350.19704: stderr chunk (state=3): >>><<< 15494 1726853350.19798: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853350.19801: _low_level_execute_command(): starting 15494 1726853350.19804: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/AnsiballZ_service_facts.py && sleep 0' 15494 1726853350.20796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853350.21084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853350.21109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853350.21126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853350.21211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
15494 1726853351.72530: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": 
"rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 15494 1726853351.72582: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": 
"enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": 
"systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15494 1726853351.72627: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", 
"source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15494 1726853351.74279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853351.74283: stderr chunk (state=3): >>><<< 15494 1726853351.74286: stdout chunk (state=3): >>><<< 15494 1726853351.74294: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": 
"systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": 
"unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": 
"microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": 
"unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
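The `service_facts` payload recorded above maps each unit name to a dict of `state`, `status`, and `source`. A minimal sketch of filtering such a result for running services, using plain Python over a small hypothetical excerpt shaped like the payload (the sample entries below are illustrative, not the full captured fact):

```python
# Filter a service_facts-style result for running units.
# `services` mirrors the shape of ansible_facts.services in the log above;
# the entries here are a hypothetical excerpt for illustration.
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "systemd-journald.service": {"name": "systemd-journald.service",
                                 "state": "running", "status": "static",
                                 "source": "systemd"},
}

# Collect the names of units whose state is "running", sorted for stable output.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['sshd.service', 'systemd-journald.service']
```

In a playbook the same filter is typically expressed in Jinja2 against `ansible_facts.services` (e.g. with `dict2items` and `selectattr`); the Python form above just makes the data shape explicit.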
15494 1726853351.75257: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853351.75282: _low_level_execute_command(): starting 15494 1726853351.75306: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853350.0801451-16345-258463046754596/ > /dev/null 2>&1 && sleep 0' 15494 1726853351.76079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853351.76117: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853351.76142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853351.76155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853351.76232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853351.78108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853351.78112: stdout chunk (state=3): >>><<< 15494 1726853351.78114: stderr chunk (state=3): >>><<< 15494 1726853351.78145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853351.78163: handler run complete 15494 1726853351.78394: variable 'ansible_facts' from source: unknown 15494 1726853351.78565: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853351.79177: variable 'ansible_facts' from source: unknown 15494 1726853351.79283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853351.79523: attempt loop complete, returning result 15494 1726853351.79549: _execute() done 15494 1726853351.79563: dumping result to json 15494 1726853351.79643: done dumping result, returning 15494 1726853351.79678: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-0028-1a50-0000000002fb] 15494 1726853351.79693: sending task result for task 02083763-bbaf-0028-1a50-0000000002fb 15494 1726853351.81158: done sending task result for task 02083763-bbaf-0028-1a50-0000000002fb 15494 1726853351.81161: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853351.81363: no more pending results, returning what we have 15494 1726853351.81366: results queue empty 15494 1726853351.81367: checking for any_errors_fatal 15494 1726853351.81370: done checking for any_errors_fatal 15494 1726853351.81372: checking for max_fail_percentage 15494 1726853351.81373: done checking for max_fail_percentage 15494 1726853351.81374: checking to see if all hosts have failed and the running result is not ok 15494 1726853351.81375: done checking to see if all hosts have failed 15494 1726853351.81376: getting the remaining hosts for this loop 15494 1726853351.81377: done getting the remaining hosts for this loop 15494 1726853351.81380: getting the next task for host managed_node1 15494 1726853351.81384: done getting next task for host managed_node1 15494 1726853351.81387: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15494 
1726853351.81390: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853351.81398: getting variables 15494 1726853351.81400: in VariableManager get_vars() 15494 1726853351.81426: Calling all_inventory to load vars for managed_node1 15494 1726853351.81428: Calling groups_inventory to load vars for managed_node1 15494 1726853351.81431: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853351.81438: Calling all_plugins_play to load vars for managed_node1 15494 1726853351.81441: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853351.81445: Calling groups_plugins_play to load vars for managed_node1 15494 1726853351.83350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853351.85063: done with get_vars() 15494 1726853351.85094: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:29:11 -0400 (0:00:01.826) 0:00:20.467 ****** 15494 1726853351.85180: entering _queue_task() for managed_node1/package_facts 15494 1726853351.86167: worker is 1 (out of 1 available) 15494 1726853351.86183: exiting _queue_task() for managed_node1/package_facts 15494 1726853351.86194: done queuing things up, now waiting for results queue to drain 15494 
1726853351.86195: waiting for pending results... 15494 1726853351.86691: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15494 1726853351.86698: in run() - task 02083763-bbaf-0028-1a50-0000000002fc 15494 1726853351.86701: variable 'ansible_search_path' from source: unknown 15494 1726853351.86704: variable 'ansible_search_path' from source: unknown 15494 1726853351.86706: calling self._execute() 15494 1726853351.86725: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853351.86736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853351.86749: variable 'omit' from source: magic vars 15494 1726853351.87104: variable 'ansible_distribution_major_version' from source: facts 15494 1726853351.87120: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853351.87129: variable 'omit' from source: magic vars 15494 1726853351.87193: variable 'omit' from source: magic vars 15494 1726853351.87232: variable 'omit' from source: magic vars 15494 1726853351.87275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853351.87376: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853351.87379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853351.87382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853351.87384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853351.87399: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853351.87407: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853351.87413: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853351.87511: Set connection var ansible_connection to ssh 15494 1726853351.87524: Set connection var ansible_pipelining to False 15494 1726853351.87534: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853351.87540: Set connection var ansible_shell_type to sh 15494 1726853351.87548: Set connection var ansible_timeout to 10 15494 1726853351.87559: Set connection var ansible_shell_executable to /bin/sh 15494 1726853351.87587: variable 'ansible_shell_executable' from source: unknown 15494 1726853351.87594: variable 'ansible_connection' from source: unknown 15494 1726853351.87776: variable 'ansible_module_compression' from source: unknown 15494 1726853351.87779: variable 'ansible_shell_type' from source: unknown 15494 1726853351.87782: variable 'ansible_shell_executable' from source: unknown 15494 1726853351.87784: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853351.87786: variable 'ansible_pipelining' from source: unknown 15494 1726853351.87788: variable 'ansible_timeout' from source: unknown 15494 1726853351.87790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853351.87819: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853351.87839: variable 'omit' from source: magic vars 15494 1726853351.87850: starting attempt loop 15494 1726853351.87860: running the handler 15494 1726853351.87881: _low_level_execute_command(): starting 15494 1726853351.87893: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853351.88578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853351.88594: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853351.88604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853351.88621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853351.88638: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853351.88692: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853351.88747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853351.88765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853351.88990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853351.89045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853351.90745: stdout chunk (state=3): >>>/root <<< 15494 1726853351.90975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853351.90979: stdout chunk (state=3): >>><<< 15494 1726853351.90983: stderr chunk (state=3): >>><<< 15494 1726853351.91006: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853351.91027: _low_level_execute_command(): starting 15494 1726853351.91063: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672 `" && echo ansible-tmp-1726853351.91013-16403-74502328419672="` echo /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672 `" ) && sleep 0' 15494 1726853351.92487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853351.92816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853351.92884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853351.94774: stdout chunk (state=3): >>>ansible-tmp-1726853351.91013-16403-74502328419672=/root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672 <<< 15494 1726853351.94881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853351.94909: stderr chunk (state=3): >>><<< 15494 1726853351.94918: stdout chunk (state=3): >>><<< 15494 1726853351.94941: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853351.91013-16403-74502328419672=/root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853351.95202: variable 'ansible_module_compression' from source: unknown 15494 1726853351.95206: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15494 1726853351.95306: variable 'ansible_facts' from source: unknown 15494 1726853351.95683: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py 15494 1726853351.95985: Sending initial data 15494 1726853351.95996: Sent initial data (159 bytes) 15494 1726853351.97359: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853351.97375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853351.97450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853351.97470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853351.97688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853351.97872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853351.97901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853351.99442: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15494 1726853351.99449: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853351.99511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853351.99612: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpcb1m6gtu /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py <<< 15494 1726853351.99615: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py" <<< 15494 1726853351.99692: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpcb1m6gtu" to remote "/root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py" <<< 15494 1726853352.02206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853352.02561: stderr chunk (state=3): >>><<< 15494 1726853352.02566: stdout chunk (state=3): >>><<< 15494 1726853352.02568: done transferring module to remote 15494 1726853352.02577: _low_level_execute_command(): starting 15494 1726853352.02579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/ /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py && sleep 0' 15494 1726853352.03732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853352.03735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853352.03738: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853352.03740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853352.03742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853352.03985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853352.03996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853352.04052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853352.05897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853352.06050: stderr chunk (state=3): >>><<< 15494 1726853352.06054: stdout chunk (state=3): >>><<< 15494 1726853352.06087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853352.06092: _low_level_execute_command(): starting 15494 1726853352.06095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/AnsiballZ_package_facts.py && sleep 0' 15494 1726853352.07228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853352.07232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853352.07234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853352.07236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853352.07238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853352.07389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853352.07402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853352.07588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853352.51796: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": 
[{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15494 1726853352.51834: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null,<<< 15494 1726853352.51862: stdout chunk (state=3): >>> "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 15494 1726853352.51907: stdout chunk (state=3): >>>"libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": <<< 15494 1726853352.51939: stdout chunk (state=3): >>>"0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", 
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 15494 1726853352.51958: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 15494 1726853352.52015: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 15494 1726853352.52021: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15494 1726853352.53824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853352.53859: stderr chunk (state=3): >>><<< 15494 1726853352.53862: stdout chunk (state=3): >>><<< 15494 1726853352.54085: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853352.55756: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853352.55791: _low_level_execute_command(): starting 15494 1726853352.55805: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853351.91013-16403-74502328419672/ > /dev/null 2>&1 && sleep 0' 15494 1726853352.56600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853352.56629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853352.56681: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853352.56701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853352.56746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853352.58627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853352.58674: stderr chunk (state=3): >>><<< 15494 1726853352.58678: stdout chunk (state=3): >>><<< 15494 1726853352.58696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853352.58699: handler run complete 15494 1726853352.59364: variable 'ansible_facts' from source: 
unknown
15494 1726853352.59703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853352.61558: variable 'ansible_facts' from source: unknown
15494 1726853352.62176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853352.62704: attempt loop complete, returning result
15494 1726853352.62723: _execute() done
15494 1726853352.62731: dumping result to json
15494 1726853352.62947: done dumping result, returning
15494 1726853352.62962: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-0028-1a50-0000000002fc]
15494 1726853352.62972: sending task result for task 02083763-bbaf-0028-1a50-0000000002fc
15494 1726853352.65494: done sending task result for task 02083763-bbaf-0028-1a50-0000000002fc
15494 1726853352.65500: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15494 1726853352.65635: no more pending results, returning what we have
15494 1726853352.65638: results queue empty
15494 1726853352.65639: checking for any_errors_fatal
15494 1726853352.65644: done checking for any_errors_fatal
15494 1726853352.65645: checking for max_fail_percentage
15494 1726853352.65646: done checking for max_fail_percentage
15494 1726853352.65647: checking to see if all hosts have failed and the running result is not ok
15494 1726853352.65648: done checking to see if all hosts have failed
15494 1726853352.65648: getting the remaining hosts for this loop
15494 1726853352.65650: done getting the remaining hosts for this loop
15494 1726853352.65653: getting the next task for host managed_node1
15494 1726853352.65659: done getting next task for host managed_node1
15494 1726853352.65662: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider
15494 1726853352.65663: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853352.65676: getting variables
15494 1726853352.65678: in VariableManager get_vars()
15494 1726853352.65706: Calling all_inventory to load vars for managed_node1
15494 1726853352.65709: Calling groups_inventory to load vars for managed_node1
15494 1726853352.65711: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853352.65719: Calling all_plugins_play to load vars for managed_node1
15494 1726853352.65722: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853352.65725: Calling groups_plugins_play to load vars for managed_node1
15494 1726853352.66900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853352.69162: done with get_vars()
15494 1726853352.69190: done getting variables
15494 1726853352.69251: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Print network provider] **************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7
Friday 20 September 2024  13:29:12 -0400 (0:00:00.841)       0:00:21.308 ******
15494 1726853352.69284: entering _queue_task() for managed_node1/debug
15494 1726853352.69619: worker is 1 (out of 1 available)
15494 1726853352.69631: exiting _queue_task() for managed_node1/debug
15494 1726853352.69645: done queuing things up, now waiting for results queue to drain
15494 1726853352.69647: waiting for pending results...
15494 1726853352.70002: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider
15494 1726853352.70081: in run() - task 02083763-bbaf-0028-1a50-00000000003b
15494 1726853352.70108: variable 'ansible_search_path' from source: unknown
15494 1726853352.70115: variable 'ansible_search_path' from source: unknown
15494 1726853352.70158: calling self._execute()
15494 1726853352.70476: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853352.70480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853352.70483: variable 'omit' from source: magic vars
15494 1726853352.70673: variable 'ansible_distribution_major_version' from source: facts
15494 1726853352.70690: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853352.70703: variable 'omit' from source: magic vars
15494 1726853352.70884: variable 'omit' from source: magic vars
15494 1726853352.70939: variable 'network_provider' from source: set_fact
15494 1726853352.70960: variable 'omit' from source: magic vars
15494 1726853352.71006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15494 1726853352.71051: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15494 1726853352.71079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15494 1726853352.71101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15494 1726853352.71118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15494 1726853352.71159: variable 'inventory_hostname' from source: host vars for 'managed_node1'
15494 1726853352.71168: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853352.71178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853352.71276: Set connection var ansible_connection to ssh
15494 1726853352.71361: Set connection var ansible_pipelining to False
15494 1726853352.71364: Set connection var ansible_module_compression to ZIP_DEFLATED
15494 1726853352.71366: Set connection var ansible_shell_type to sh
15494 1726853352.71368: Set connection var ansible_timeout to 10
15494 1726853352.71372: Set connection var ansible_shell_executable to /bin/sh
15494 1726853352.71374: variable 'ansible_shell_executable' from source: unknown
15494 1726853352.71377: variable 'ansible_connection' from source: unknown
15494 1726853352.71379: variable 'ansible_module_compression' from source: unknown
15494 1726853352.71381: variable 'ansible_shell_type' from source: unknown
15494 1726853352.71382: variable 'ansible_shell_executable' from source: unknown
15494 1726853352.71385: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853352.71386: variable 'ansible_pipelining' from source: unknown
15494 1726853352.71388: variable 'ansible_timeout' from source: unknown
15494 1726853352.71390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853352.71514: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
15494 1726853352.71529: variable 'omit' from source: magic vars
15494 1726853352.71538: starting attempt loop
15494 1726853352.71544: running the handler
15494 1726853352.71596: handler run complete
15494 1726853352.71614: attempt loop complete, returning result
15494 1726853352.71620: _execute() done
15494 1726853352.71626: dumping result to json
15494 1726853352.71633: done dumping result, returning
15494 1726853352.71644: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-0028-1a50-00000000003b]
15494 1726853352.71677: sending task result for task 02083763-bbaf-0028-1a50-00000000003b
ok: [managed_node1] => {}

MSG:

Using network provider: nm
15494 1726853352.71848: no more pending results, returning what we have
15494 1726853352.71852: results queue empty
15494 1726853352.71853: checking for any_errors_fatal
15494 1726853352.71864: done checking for any_errors_fatal
15494 1726853352.71864: checking for max_fail_percentage
15494 1726853352.71866: done checking for max_fail_percentage
15494 1726853352.71867: checking to see if all hosts have failed and the running result is not ok
15494 1726853352.71868: done checking to see if all hosts have failed
15494 1726853352.71869: getting the remaining hosts for this loop
15494 1726853352.71872: done getting the remaining hosts for this loop
15494 1726853352.71876: getting the next task for host managed_node1
15494 1726853352.71883: done getting next task for host managed_node1
15494 1726853352.71887: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
15494 1726853352.71889: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853352.71899: getting variables
15494 1726853352.71901: in VariableManager get_vars()
15494 1726853352.71939: Calling all_inventory to load vars for managed_node1
15494 1726853352.71941: Calling groups_inventory to load vars for managed_node1
15494 1726853352.71944: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853352.71954: Calling all_plugins_play to load vars for managed_node1
15494 1726853352.71957: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853352.71960: Calling groups_plugins_play to load vars for managed_node1
15494 1726853352.73023: done sending task result for task 02083763-bbaf-0028-1a50-00000000003b
15494 1726853352.73028: WORKER PROCESS EXITING
15494 1726853352.73802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853352.75439: done with get_vars()
15494 1726853352.75466: done getting variables
15494 1726853352.75524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Friday 20 September 2024  13:29:12 -0400 (0:00:00.062)       0:00:21.371 ******
15494 1726853352.75559: entering _queue_task() for managed_node1/fail
15494 1726853352.75887: worker is 1 (out of 1 available)
15494 1726853352.75898: exiting _queue_task() for managed_node1/fail
15494 1726853352.75910: done queuing things up, now waiting for results queue to drain
15494 1726853352.75911: waiting for pending results...
15494 1726853352.76185: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
15494 1726853352.76301: in run() - task 02083763-bbaf-0028-1a50-00000000003c
15494 1726853352.76321: variable 'ansible_search_path' from source: unknown
15494 1726853352.76417: variable 'ansible_search_path' from source: unknown
15494 1726853352.76420: calling self._execute()
15494 1726853352.76473: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853352.76484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853352.76498: variable 'omit' from source: magic vars
15494 1726853352.76886: variable 'ansible_distribution_major_version' from source: facts
15494 1726853352.76902: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853352.77029: variable 'network_state' from source: role '' defaults
15494 1726853352.77046: Evaluated conditional (network_state != {}): False
15494 1726853352.77055: when evaluation is False, skipping this task
15494 1726853352.77062: _execute() done
15494 1726853352.77074: dumping result to json
15494 1726853352.77082: done dumping result, returning
15494 1726853352.77094: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-0028-1a50-00000000003c]
15494 1726853352.77104: sending task result for task 02083763-bbaf-0028-1a50-00000000003c
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15494 1726853352.77241: no more pending results, returning what we have
15494 1726853352.77246: results queue empty
15494 1726853352.77250: checking for any_errors_fatal
15494 1726853352.77261: done checking for any_errors_fatal
15494 1726853352.77262: checking for max_fail_percentage
15494 1726853352.77263: done checking for max_fail_percentage
15494 1726853352.77264: checking to see if all hosts have failed and the running result is not ok
15494 1726853352.77265: done checking to see if all hosts have failed
15494 1726853352.77266: getting the remaining hosts for this loop
15494 1726853352.77268: done getting the remaining hosts for this loop
15494 1726853352.77273: getting the next task for host managed_node1
15494 1726853352.77280: done getting next task for host managed_node1
15494 1726853352.77284: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
15494 1726853352.77286: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853352.77302: getting variables
15494 1726853352.77304: in VariableManager get_vars()
15494 1726853352.77339: Calling all_inventory to load vars for managed_node1
15494 1726853352.77342: Calling groups_inventory to load vars for managed_node1
15494 1726853352.77345: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853352.77359: Calling all_plugins_play to load vars for managed_node1
15494 1726853352.77362: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853352.77365: Calling groups_plugins_play to load vars for managed_node1
15494 1726853352.77663: done sending task result for task 02083763-bbaf-0028-1a50-00000000003c
15494 1726853352.77666: WORKER PROCESS EXITING
15494 1726853352.78939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853352.81996: done with get_vars()
15494 1726853352.82024: done getting variables
15494 1726853352.82300: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Friday 20 September 2024  13:29:12 -0400 (0:00:00.067)       0:00:21.439 ******
15494 1726853352.82334: entering _queue_task() for managed_node1/fail
15494 1726853352.82916: worker is 1 (out of 1 available)
15494 1726853352.82927: exiting _queue_task() for managed_node1/fail
15494 1726853352.82939: done queuing things up, now waiting for results queue to drain
15494 1726853352.82940: waiting for pending results...
15494 1726853352.83301: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
15494 1726853352.83607: in run() - task 02083763-bbaf-0028-1a50-00000000003d
15494 1726853352.83684: variable 'ansible_search_path' from source: unknown
15494 1726853352.83688: variable 'ansible_search_path' from source: unknown
15494 1726853352.83728: calling self._execute()
15494 1726853352.84062: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853352.84068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853352.84080: variable 'omit' from source: magic vars
15494 1726853352.84837: variable 'ansible_distribution_major_version' from source: facts
15494 1726853352.84878: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853352.85082: variable 'network_state' from source: role '' defaults
15494 1726853352.85092: Evaluated conditional (network_state != {}): False
15494 1726853352.85096: when evaluation is False, skipping this task
15494 1726853352.85100: _execute() done
15494 1726853352.85103: dumping result to json
15494 1726853352.85105: done dumping result, returning
15494 1726853352.85140: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-0028-1a50-00000000003d]
15494 1726853352.85144: sending task result for task 02083763-bbaf-0028-1a50-00000000003d
15494 1726853352.85413: done sending task result for task 02083763-bbaf-0028-1a50-00000000003d
15494 1726853352.85416: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
15494 1726853352.85474: no more pending results, returning what we have
15494 1726853352.85478: results queue empty
15494 1726853352.85479: checking for any_errors_fatal
15494 1726853352.85485: done checking for any_errors_fatal
15494 1726853352.85486: checking for max_fail_percentage
15494 1726853352.85488: done checking for max_fail_percentage
15494 1726853352.85489: checking to see if all hosts have failed and the running result is not ok
15494 1726853352.85489: done checking to see if all hosts have failed
15494 1726853352.85490: getting the remaining hosts for this loop
15494 1726853352.85492: done getting the remaining hosts for this loop
15494 1726853352.85495: getting the next task for host managed_node1
15494 1726853352.85501: done getting next task for host managed_node1
15494 1726853352.85505: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
15494 1726853352.85507: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853352.85521: getting variables
15494 1726853352.85522: in VariableManager get_vars()
15494 1726853352.85560: Calling all_inventory to load vars for managed_node1
15494 1726853352.85563: Calling groups_inventory to load vars for managed_node1
15494 1726853352.85565: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853352.85679: Calling all_plugins_play to load vars for managed_node1
15494 1726853352.85683: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853352.85687: Calling groups_plugins_play to load vars for managed_node1
15494 1726853352.88599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853352.91008: done with get_vars()
15494 1726853352.91043: done getting variables
15494 1726853352.91112: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Friday 20 September 2024  13:29:12 -0400 (0:00:00.088)       0:00:21.527 ******
15494 1726853352.91149: entering _queue_task() for managed_node1/fail
15494 1726853352.91497: worker is 1 (out of 1 available)
15494 1726853352.91509: exiting _queue_task() for managed_node1/fail
15494 1726853352.91523: done queuing things up, now waiting for results queue to drain
15494 1726853352.91524: waiting for pending results...
15494 1726853352.91899: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
15494 1726853352.91955: in run() - task 02083763-bbaf-0028-1a50-00000000003e
15494 1726853352.91982: variable 'ansible_search_path' from source: unknown
15494 1726853352.91995: variable 'ansible_search_path' from source: unknown
15494 1726853352.92105: calling self._execute()
15494 1726853352.92155: variable 'ansible_host' from source: host vars for 'managed_node1'
15494 1726853352.92168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
15494 1726853352.92187: variable 'omit' from source: magic vars
15494 1726853352.92886: variable 'ansible_distribution_major_version' from source: facts
15494 1726853352.92890: Evaluated conditional (ansible_distribution_major_version != '6'): True
15494 1726853352.93153: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15494 1726853352.95746: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15494 1726853352.95820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15494 1726853352.95867: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15494 1726853352.95934: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15494 1726853352.95975: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15494 1726853352.96278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853352.96281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853352.96403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853352.96443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853352.96462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853352.96711: variable 'ansible_distribution_major_version' from source: facts
15494 1726853352.96732: Evaluated conditional (ansible_distribution_major_version | int > 9): True
15494 1726853352.96906: variable 'ansible_distribution' from source: facts
15494 1726853352.97040: variable '__network_rh_distros' from source: role '' defaults
15494 1726853352.97056: Evaluated conditional (ansible_distribution in __network_rh_distros): True
15494 1726853352.97491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853352.97522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853352.97556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853352.97607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853352.97627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853352.97689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853352.97717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853352.97744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853352.97796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853352.97815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853352.97862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15494 1726853352.97891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15494 1726853352.97924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853352.97968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15494 1726853352.97990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15494 1726853352.98343: variable 'network_connections' from source: play vars
15494 1726853352.98362: variable 'profile' from source: play vars
15494 1726853352.98462: variable 'profile' from source: play vars
15494 1726853352.98474: variable 'interface' from source: set_fact
15494 1726853352.98537: variable 'interface' from source: set_fact
15494 1726853352.98559: variable 'network_state' from source: role '' defaults
15494 1726853352.98632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15494 1726853352.98887: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15494 1726853352.98891: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15494 1726853352.98922: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15494 1726853352.98979: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15494 1726853352.99032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15494 1726853352.99068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15494 1726853352.99104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15494 1726853352.99133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15494 1726853352.99176: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
15494 1726853352.99179: when evaluation is False, skipping this task
15494 1726853352.99181: _execute() done
15494 1726853352.99183: dumping result to json
15494 1726853352.99185: done dumping result, returning
15494 1726853352.99213: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-0028-1a50-00000000003e]
15494 1726853352.99216: sending task result for task 02083763-bbaf-0028-1a50-00000000003e
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
15494 1726853352.99517: no more pending results, returning what we have
15494 1726853352.99521: results queue empty
15494 1726853352.99522: checking for any_errors_fatal
15494 1726853352.99529: done checking for any_errors_fatal
15494 1726853352.99530: checking for max_fail_percentage
15494 1726853352.99532: done checking for max_fail_percentage
15494 1726853352.99533: checking to see if all hosts have failed and the running result is not ok
15494 1726853352.99534: done checking to see if all hosts have failed
15494 1726853352.99534: getting the remaining hosts for this loop
15494 1726853352.99536: done getting the remaining hosts for this loop
15494 1726853352.99540: getting the next task for host managed_node1
15494 1726853352.99550: done getting next task for host managed_node1
15494 1726853352.99553: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15494 1726853352.99555: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15494 1726853352.99568: getting variables
15494 1726853352.99570: in VariableManager get_vars()
15494 1726853352.99609: Calling all_inventory to load vars for managed_node1
15494 1726853352.99612: Calling groups_inventory to load vars for managed_node1
15494 1726853352.99614: Calling all_plugins_inventory to load vars for managed_node1
15494 1726853352.99625: Calling all_plugins_play to load vars for managed_node1
15494 1726853352.99627: Calling groups_plugins_inventory to load vars for managed_node1
15494 1726853352.99630: Calling groups_plugins_play to load vars for managed_node1
15494 1726853353.00187: done sending task result for task 02083763-bbaf-0028-1a50-00000000003e
15494 1726853353.00190: WORKER PROCESS EXITING
15494 1726853353.01693: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15494 1726853353.04823: done with get_vars()
15494 1726853353.04848: done getting variables
15494 1726853353.04910: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024  13:29:13 -0400 (0:00:00.137)       0:00:21.665 ******
15494 1726853353.04943: entering _queue_task() for managed_node1/dnf
15494 1726853353.05498: worker is 1 (out of 1 available)
15494 1726853353.05511: exiting _queue_task() for managed_node1/dnf
15494 1726853353.05523: done queuing things up, now waiting for results queue to drain
15494 1726853353.05525: waiting for pending results...
15494 1726853353.05796: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15494 1726853353.05904: in run() - task 02083763-bbaf-0028-1a50-00000000003f 15494 1726853353.05922: variable 'ansible_search_path' from source: unknown 15494 1726853353.05929: variable 'ansible_search_path' from source: unknown 15494 1726853353.05968: calling self._execute() 15494 1726853353.06084: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.06098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.06117: variable 'omit' from source: magic vars 15494 1726853353.06540: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.06648: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.06763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853353.09749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853353.09820: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853353.09865: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853353.09906: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853353.09942: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853353.10025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.10061: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.10092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.10135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.10153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.10281: variable 'ansible_distribution' from source: facts 15494 1726853353.10291: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.10310: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15494 1726853353.10430: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.10563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.10592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.10777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.10780: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.10783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.10785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.10787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.10789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.10805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.10822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.10863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.10891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 
1726853353.10923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.10963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.10982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.11146: variable 'network_connections' from source: play vars 15494 1726853353.11162: variable 'profile' from source: play vars 15494 1726853353.11231: variable 'profile' from source: play vars 15494 1726853353.11240: variable 'interface' from source: set_fact 15494 1726853353.11348: variable 'interface' from source: set_fact 15494 1726853353.11442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853353.11765: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853353.11843: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853353.11918: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853353.11922: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853353.12044: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853353.12102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853353.12174: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.12231: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853353.12378: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853353.12527: variable 'network_connections' from source: play vars 15494 1726853353.12537: variable 'profile' from source: play vars 15494 1726853353.12625: variable 'profile' from source: play vars 15494 1726853353.12652: variable 'interface' from source: set_fact 15494 1726853353.12748: variable 'interface' from source: set_fact 15494 1726853353.12832: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853353.12844: when evaluation is False, skipping this task 15494 1726853353.12862: _execute() done 15494 1726853353.12870: dumping result to json 15494 1726853353.12890: done dumping result, returning 15494 1726853353.12920: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-0028-1a50-00000000003f] 15494 1726853353.12932: sending task result for task 02083763-bbaf-0028-1a50-00000000003f skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15494 1726853353.13157: no more pending results, returning what we have 15494 1726853353.13162: results queue 
empty 15494 1726853353.13163: checking for any_errors_fatal 15494 1726853353.13169: done checking for any_errors_fatal 15494 1726853353.13170: checking for max_fail_percentage 15494 1726853353.13173: done checking for max_fail_percentage 15494 1726853353.13174: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.13175: done checking to see if all hosts have failed 15494 1726853353.13175: getting the remaining hosts for this loop 15494 1726853353.13177: done getting the remaining hosts for this loop 15494 1726853353.13182: getting the next task for host managed_node1 15494 1726853353.13189: done getting next task for host managed_node1 15494 1726853353.13193: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15494 1726853353.13195: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853353.13210: getting variables 15494 1726853353.13213: in VariableManager get_vars() 15494 1726853353.13266: Calling all_inventory to load vars for managed_node1 15494 1726853353.13269: Calling groups_inventory to load vars for managed_node1 15494 1726853353.13508: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.13536: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.13539: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.13544: done sending task result for task 02083763-bbaf-0028-1a50-00000000003f 15494 1726853353.13549: WORKER PROCESS EXITING 15494 1726853353.13560: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.16209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.18031: done with get_vars() 15494 1726853353.18063: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15494 1726853353.18142: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:29:13 -0400 (0:00:00.132) 0:00:21.797 ****** 15494 1726853353.18176: entering _queue_task() for managed_node1/yum 15494 1726853353.18551: worker is 1 (out of 1 available) 15494 1726853353.18562: exiting _queue_task() for managed_node1/yum 15494 1726853353.18579: done queuing things up, now 
waiting for results queue to drain 15494 1726853353.18580: waiting for pending results... 15494 1726853353.18847: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15494 1726853353.18967: in run() - task 02083763-bbaf-0028-1a50-000000000040 15494 1726853353.19045: variable 'ansible_search_path' from source: unknown 15494 1726853353.19048: variable 'ansible_search_path' from source: unknown 15494 1726853353.19051: calling self._execute() 15494 1726853353.19133: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.19144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.19165: variable 'omit' from source: magic vars 15494 1726853353.19572: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.19598: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.19804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853353.22311: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853353.22387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853353.22429: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853353.22466: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853353.22493: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853353.22584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.22615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.22646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.22693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.22709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.22814: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.22837: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15494 1726853353.22840: when evaluation is False, skipping this task 15494 1726853353.22844: _execute() done 15494 1726853353.22847: dumping result to json 15494 1726853353.22852: done dumping result, returning 15494 1726853353.22862: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000040] 15494 1726853353.22867: sending task result for task 02083763-bbaf-0028-1a50-000000000040 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15494 1726853353.23274: no more pending results, returning 
what we have 15494 1726853353.23278: results queue empty 15494 1726853353.23279: checking for any_errors_fatal 15494 1726853353.23284: done checking for any_errors_fatal 15494 1726853353.23285: checking for max_fail_percentage 15494 1726853353.23287: done checking for max_fail_percentage 15494 1726853353.23287: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.23288: done checking to see if all hosts have failed 15494 1726853353.23289: getting the remaining hosts for this loop 15494 1726853353.23290: done getting the remaining hosts for this loop 15494 1726853353.23293: getting the next task for host managed_node1 15494 1726853353.23299: done getting next task for host managed_node1 15494 1726853353.23303: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15494 1726853353.23305: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853353.23319: getting variables 15494 1726853353.23320: in VariableManager get_vars() 15494 1726853353.23354: Calling all_inventory to load vars for managed_node1 15494 1726853353.23357: Calling groups_inventory to load vars for managed_node1 15494 1726853353.23359: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.23368: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.23372: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.23376: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.24000: done sending task result for task 02083763-bbaf-0028-1a50-000000000040 15494 1726853353.24003: WORKER PROCESS EXITING 15494 1726853353.24961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.26596: done with get_vars() 15494 1726853353.26631: done getting variables 15494 1726853353.26699: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:29:13 -0400 (0:00:00.085) 0:00:21.883 ****** 15494 1726853353.26743: entering _queue_task() for managed_node1/fail 15494 1726853353.27128: worker is 1 (out of 1 available) 15494 1726853353.27141: exiting _queue_task() for managed_node1/fail 15494 1726853353.27157: done queuing things up, now waiting for results queue to drain 15494 1726853353.27158: waiting for pending results... 
15494 1726853353.27478: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15494 1726853353.27614: in run() - task 02083763-bbaf-0028-1a50-000000000041 15494 1726853353.27636: variable 'ansible_search_path' from source: unknown 15494 1726853353.27646: variable 'ansible_search_path' from source: unknown 15494 1726853353.27693: calling self._execute() 15494 1726853353.27812: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.27832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.27854: variable 'omit' from source: magic vars 15494 1726853353.28370: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.28375: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.28411: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.28641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853353.31034: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853353.31112: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853353.31160: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853353.31265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853353.31268: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853353.31339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15494 1726853353.31413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.31426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.31477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.31502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.31602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.31606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.31635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.31686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.31715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.31767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.31801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.31845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.31928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.31932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.32117: variable 'network_connections' from source: play vars 15494 1726853353.32134: variable 'profile' from source: play vars 15494 1726853353.32217: variable 'profile' from source: play vars 15494 1726853353.32234: variable 'interface' from source: set_fact 15494 1726853353.32308: variable 'interface' from source: set_fact 15494 1726853353.32401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853353.32675: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853353.32683: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853353.32703: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853353.32741: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853353.32797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853353.32832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853353.32869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.32915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853353.32963: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853353.33252: variable 'network_connections' from source: play vars 15494 1726853353.33276: variable 'profile' from source: play vars 15494 1726853353.33327: variable 'profile' from source: play vars 15494 1726853353.33356: variable 'interface' from source: set_fact 15494 1726853353.33415: variable 'interface' from source: set_fact 15494 1726853353.33462: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853353.33468: when evaluation is False, skipping this task 15494 1726853353.33476: _execute() done 15494 1726853353.33479: dumping result to json 15494 1726853353.33569: done dumping result, returning 15494 1726853353.33574: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000041] 15494 1726853353.33585: sending task result for task 02083763-bbaf-0028-1a50-000000000041 15494 1726853353.33660: done sending task result for task 02083763-bbaf-0028-1a50-000000000041 15494 1726853353.33664: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15494 1726853353.33719: no more pending results, returning what we have 15494 1726853353.33724: results queue empty 15494 1726853353.33725: checking for any_errors_fatal 15494 1726853353.33731: done checking for any_errors_fatal 15494 1726853353.33733: checking for max_fail_percentage 15494 1726853353.33735: done checking for max_fail_percentage 15494 1726853353.33736: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.33737: done checking to see if all hosts have failed 15494 1726853353.33737: getting the remaining hosts for this loop 15494 1726853353.33739: done getting the remaining hosts for this loop 15494 1726853353.33748: getting the next task for host managed_node1 15494 1726853353.33755: done getting next task for host managed_node1 15494 1726853353.33760: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15494 1726853353.33762: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853353.33778: getting variables 15494 1726853353.33780: in VariableManager get_vars() 15494 1726853353.33827: Calling all_inventory to load vars for managed_node1 15494 1726853353.33830: Calling groups_inventory to load vars for managed_node1 15494 1726853353.33832: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.33845: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.33849: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.33853: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.35591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.37261: done with get_vars() 15494 1726853353.37292: done getting variables 15494 1726853353.37359: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:29:13 -0400 (0:00:00.106) 0:00:21.989 ****** 15494 1726853353.37395: entering _queue_task() for managed_node1/package 15494 1726853353.37754: worker is 1 (out of 1 available) 15494 1726853353.37767: exiting _queue_task() for managed_node1/package 15494 1726853353.37983: done queuing things up, now waiting for results queue to drain 15494 1726853353.37985: waiting for pending results... 
15494 1726853353.38206: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15494 1726853353.38282: in run() - task 02083763-bbaf-0028-1a50-000000000042 15494 1726853353.38289: variable 'ansible_search_path' from source: unknown 15494 1726853353.38293: variable 'ansible_search_path' from source: unknown 15494 1726853353.38322: calling self._execute() 15494 1726853353.38438: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.38455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.38507: variable 'omit' from source: magic vars 15494 1726853353.38874: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.38891: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.39118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853353.39428: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853353.39495: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853353.39598: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853353.39603: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853353.39702: variable 'network_packages' from source: role '' defaults 15494 1726853353.39830: variable '__network_provider_setup' from source: role '' defaults 15494 1726853353.39848: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853353.39936: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853353.39951: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853353.40066: variable 
'__network_packages_default_nm' from source: role '' defaults 15494 1726853353.40267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853353.50851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853353.50876: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853353.50958: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853353.50961: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853353.50982: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853353.51047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.51084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.51280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.51298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.51313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 
1726853353.51357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.51607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.51611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.51613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.51615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.52179: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15494 1726853353.52678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.52682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.52684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.52818: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.52832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.53112: variable 'ansible_python' from source: facts 15494 1726853353.53115: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15494 1726853353.53355: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853353.53475: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853353.53614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.53638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.53688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.53727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.53741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.53796: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.53818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.53841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.53886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.53900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.54043: variable 'network_connections' from source: play vars 15494 1726853353.54048: variable 'profile' from source: play vars 15494 1726853353.54155: variable 'profile' from source: play vars 15494 1726853353.54162: variable 'interface' from source: set_fact 15494 1726853353.54235: variable 'interface' from source: set_fact 15494 1726853353.54309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853353.54334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853353.54366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.54402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853353.54442: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.54734: variable 'network_connections' from source: play vars 15494 1726853353.54740: variable 'profile' from source: play vars 15494 1726853353.54840: variable 'profile' from source: play vars 15494 1726853353.54854: variable 'interface' from source: set_fact 15494 1726853353.54924: variable 'interface' from source: set_fact 15494 1726853353.54962: variable '__network_packages_default_wireless' from source: role '' defaults 15494 1726853353.55041: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.55793: variable 'network_connections' from source: play vars 15494 1726853353.55796: variable 'profile' from source: play vars 15494 1726853353.55929: variable 'profile' from source: play vars 15494 1726853353.55932: variable 'interface' from source: set_fact 15494 1726853353.56074: variable 'interface' from source: set_fact 15494 1726853353.56215: variable '__network_packages_default_team' from source: role '' defaults 15494 1726853353.56441: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853353.56980: variable 'network_connections' from source: play vars 15494 1726853353.56986: variable 'profile' from source: play vars 15494 1726853353.57049: variable 'profile' from source: play vars 15494 1726853353.57056: variable 'interface' from source: set_fact 15494 1726853353.57305: variable 'interface' from source: set_fact 15494 1726853353.57408: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853353.57469: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853353.57522: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853353.57682: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853353.58046: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15494 1726853353.59278: variable 'network_connections' from source: play vars 15494 1726853353.59281: variable 'profile' from source: play vars 15494 1726853353.59283: variable 'profile' from source: play vars 15494 1726853353.59286: variable 'interface' from source: set_fact 15494 1726853353.59377: variable 'interface' from source: set_fact 15494 1726853353.59382: variable 'ansible_distribution' from source: facts 15494 1726853353.59385: variable '__network_rh_distros' from source: role '' defaults 15494 1726853353.59387: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.59390: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15494 1726853353.59559: variable 'ansible_distribution' from source: facts 15494 1726853353.59563: variable '__network_rh_distros' from source: role '' defaults 15494 1726853353.59568: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.59582: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15494 1726853353.59744: variable 'ansible_distribution' from source: facts 15494 1726853353.59748: variable '__network_rh_distros' from source: role '' defaults 15494 1726853353.59756: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.59797: variable 'network_provider' from source: set_fact 15494 1726853353.59811: variable 'ansible_facts' from source: unknown 15494 1726853353.60478: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15494 
1726853353.60482: when evaluation is False, skipping this task 15494 1726853353.60484: _execute() done 15494 1726853353.60487: dumping result to json 15494 1726853353.60489: done dumping result, returning 15494 1726853353.60495: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-0028-1a50-000000000042] 15494 1726853353.60498: sending task result for task 02083763-bbaf-0028-1a50-000000000042 skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15494 1726853353.60637: no more pending results, returning what we have 15494 1726853353.60641: results queue empty 15494 1726853353.60642: checking for any_errors_fatal 15494 1726853353.60648: done checking for any_errors_fatal 15494 1726853353.60649: checking for max_fail_percentage 15494 1726853353.60651: done checking for max_fail_percentage 15494 1726853353.60651: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.60652: done checking to see if all hosts have failed 15494 1726853353.60653: getting the remaining hosts for this loop 15494 1726853353.60654: done getting the remaining hosts for this loop 15494 1726853353.60657: getting the next task for host managed_node1 15494 1726853353.60663: done getting next task for host managed_node1 15494 1726853353.60666: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15494 1726853353.60668: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853353.60682: getting variables 15494 1726853353.60684: in VariableManager get_vars() 15494 1726853353.60717: Calling all_inventory to load vars for managed_node1 15494 1726853353.60719: Calling groups_inventory to load vars for managed_node1 15494 1726853353.60721: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.60730: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.60737: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.60740: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.61290: done sending task result for task 02083763-bbaf-0028-1a50-000000000042 15494 1726853353.61294: WORKER PROCESS EXITING 15494 1726853353.67630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.69210: done with get_vars() 15494 1726853353.69241: done getting variables 15494 1726853353.69299: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:29:13 -0400 (0:00:00.319) 0:00:22.309 ****** 15494 1726853353.69326: entering _queue_task() for managed_node1/package 15494 1726853353.69693: worker is 1 (out of 1 available) 15494 1726853353.69708: exiting _queue_task() for managed_node1/package 15494 1726853353.69720: done queuing things up, now waiting for results queue to drain 15494 1726853353.69722: waiting for pending results... 
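The skip recorded above for the "Install packages" task came from the conditional `not network_packages is subset(ansible_facts.packages.keys())` evaluating to False — i.e. every package in `network_packages` was already present in the gathered package facts, so the install step was unnecessary. The real task body lives in the role at `roles/network/tasks/main.yml:73` and is not shown in this log; the following is only a minimal sketch of the gating pattern, with the variable names taken from the log:

```yaml
# Hypothetical sketch (not the role's actual task body).
# ansible_facts.packages is populated by an earlier package_facts task.
- name: Install packages
  package:
    name: "{{ network_packages }}"
    state: present
  # Skip the package transaction entirely when the facts already
  # show every required package installed.
  when: not network_packages is subset(ansible_facts.packages.keys())
```

The `subset` test here is the Jinja2 test shipped with ansible-core (loaded above from `plugins/test/mathstuff.py`), which checks that the left-hand list is contained in the right-hand iterable.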
15494 1726853353.70010: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15494 1726853353.70140: in run() - task 02083763-bbaf-0028-1a50-000000000043 15494 1726853353.70161: variable 'ansible_search_path' from source: unknown 15494 1726853353.70168: variable 'ansible_search_path' from source: unknown 15494 1726853353.70214: calling self._execute() 15494 1726853353.70321: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.70332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.70345: variable 'omit' from source: magic vars 15494 1726853353.70748: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.70844: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.70892: variable 'network_state' from source: role '' defaults 15494 1726853353.70906: Evaluated conditional (network_state != {}): False 15494 1726853353.70912: when evaluation is False, skipping this task 15494 1726853353.70919: _execute() done 15494 1726853353.70925: dumping result to json 15494 1726853353.70933: done dumping result, returning 15494 1726853353.70943: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-0028-1a50-000000000043] 15494 1726853353.70954: sending task result for task 02083763-bbaf-0028-1a50-000000000043 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853353.71114: no more pending results, returning what we have 15494 1726853353.71119: results queue empty 15494 1726853353.71120: checking for any_errors_fatal 15494 1726853353.71130: done checking for any_errors_fatal 15494 1726853353.71130: checking for max_fail_percentage 15494 
1726853353.71132: done checking for max_fail_percentage 15494 1726853353.71133: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.71134: done checking to see if all hosts have failed 15494 1726853353.71135: getting the remaining hosts for this loop 15494 1726853353.71137: done getting the remaining hosts for this loop 15494 1726853353.71140: getting the next task for host managed_node1 15494 1726853353.71148: done getting next task for host managed_node1 15494 1726853353.71151: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15494 1726853353.71153: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853353.71168: getting variables 15494 1726853353.71172: in VariableManager get_vars() 15494 1726853353.71209: Calling all_inventory to load vars for managed_node1 15494 1726853353.71212: Calling groups_inventory to load vars for managed_node1 15494 1726853353.71214: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.71226: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.71229: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.71232: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.72213: done sending task result for task 02083763-bbaf-0028-1a50-000000000043 15494 1726853353.72217: WORKER PROCESS EXITING 15494 1726853353.72957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.74520: done with get_vars() 15494 1726853353.74546: done getting variables 15494 1726853353.74608: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:29:13 -0400 (0:00:00.053) 0:00:22.362 ****** 15494 1726853353.74644: entering _queue_task() for managed_node1/package 15494 1726853353.74999: worker is 1 (out of 1 available) 15494 1726853353.75013: exiting _queue_task() for managed_node1/package 15494 1726853353.75028: done queuing things up, now waiting for results queue to drain 15494 1726853353.75030: waiting for pending results... 15494 1726853353.75490: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15494 1726853353.75495: in run() - task 02083763-bbaf-0028-1a50-000000000044 15494 1726853353.75499: variable 'ansible_search_path' from source: unknown 15494 1726853353.75502: variable 'ansible_search_path' from source: unknown 15494 1726853353.75512: calling self._execute() 15494 1726853353.75618: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.75630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.75646: variable 'omit' from source: magic vars 15494 1726853353.76061: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.76079: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.76202: variable 'network_state' from source: role '' defaults 15494 1726853353.76217: Evaluated conditional (network_state != {}): False 15494 1726853353.76226: when evaluation is False, 
skipping this task 15494 1726853353.76233: _execute() done 15494 1726853353.76242: dumping result to json 15494 1726853353.76254: done dumping result, returning 15494 1726853353.76272: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-0028-1a50-000000000044] 15494 1726853353.76378: sending task result for task 02083763-bbaf-0028-1a50-000000000044 15494 1726853353.76448: done sending task result for task 02083763-bbaf-0028-1a50-000000000044 15494 1726853353.76451: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853353.76528: no more pending results, returning what we have 15494 1726853353.76532: results queue empty 15494 1726853353.76533: checking for any_errors_fatal 15494 1726853353.76540: done checking for any_errors_fatal 15494 1726853353.76541: checking for max_fail_percentage 15494 1726853353.76543: done checking for max_fail_percentage 15494 1726853353.76543: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.76544: done checking to see if all hosts have failed 15494 1726853353.76545: getting the remaining hosts for this loop 15494 1726853353.76547: done getting the remaining hosts for this loop 15494 1726853353.76550: getting the next task for host managed_node1 15494 1726853353.76556: done getting next task for host managed_node1 15494 1726853353.76560: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15494 1726853353.76562: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853353.76581: getting variables 15494 1726853353.76583: in VariableManager get_vars() 15494 1726853353.76620: Calling all_inventory to load vars for managed_node1 15494 1726853353.76622: Calling groups_inventory to load vars for managed_node1 15494 1726853353.76625: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.76637: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.76640: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.76643: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.78280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.79830: done with get_vars() 15494 1726853353.79860: done getting variables 15494 1726853353.79924: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:29:13 -0400 (0:00:00.053) 0:00:22.415 ****** 15494 1726853353.79958: entering _queue_task() for managed_node1/service 15494 1726853353.80329: worker is 1 (out of 1 available) 15494 1726853353.80341: exiting _queue_task() for managed_node1/service 15494 1726853353.80355: done queuing things up, now waiting for results queue to drain 15494 1726853353.80357: waiting for pending results... 
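Both nmstate-related tasks above ("Install NetworkManager and nmstate when using network_state variable" and "Install python3-libnmstate when using network_state variable") were skipped for the same reason: `network_state != {}` evaluated to False, because the role's `network_state` default is an empty dict and this play never sets it, so the nmstate code path is inactive. A hedged sketch of that gating pattern (the actual tasks are at `roles/network/tasks/main.yml:85` and `:96`; exact package lists may differ):

```yaml
# Hypothetical sketch of the network_state guard seen in the log.
- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate
    state: present
  # network_state defaults to {}; only pull in nmstate support
  # when the caller actually provides a state description.
  when: network_state != {}
```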
15494 1726853353.80650: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15494 1726853353.80798: in run() - task 02083763-bbaf-0028-1a50-000000000045 15494 1726853353.80976: variable 'ansible_search_path' from source: unknown 15494 1726853353.80979: variable 'ansible_search_path' from source: unknown 15494 1726853353.80982: calling self._execute() 15494 1726853353.80984: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.80987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.80989: variable 'omit' from source: magic vars 15494 1726853353.81391: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.81411: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.81550: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.81760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853353.83981: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853353.84064: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853353.84107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853353.84148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853353.84184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853353.84276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15494 1726853353.84309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.84339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.84390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.84411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.84460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.84495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.84525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.84578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.84591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.84686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.84689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.84692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.84731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.84751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.84932: variable 'network_connections' from source: play vars 15494 1726853353.84950: variable 'profile' from source: play vars 15494 1726853353.85028: variable 'profile' from source: play vars 15494 1726853353.85038: variable 'interface' from source: set_fact 15494 1726853353.85100: variable 'interface' from source: set_fact 15494 1726853353.85230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853353.85366: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853353.85404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853353.85439: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853353.85478: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853353.85556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853353.85559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853353.85582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.85612: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853353.85669: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853353.85908: variable 'network_connections' from source: play vars 15494 1726853353.85975: variable 'profile' from source: play vars 15494 1726853353.85979: variable 'profile' from source: play vars 15494 1726853353.85986: variable 'interface' from source: set_fact 15494 1726853353.86047: variable 'interface' from source: set_fact 15494 1726853353.86079: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853353.86086: when evaluation is False, skipping this task 15494 1726853353.86098: _execute() done 15494 1726853353.86106: dumping result to json 15494 1726853353.86113: done dumping result, returning 15494 1726853353.86124: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000045] 15494 1726853353.86203: sending task result for task 02083763-bbaf-0028-1a50-000000000045 15494 1726853353.86280: done sending task result for task 02083763-bbaf-0028-1a50-000000000045 15494 1726853353.86283: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15494 1726853353.86350: no more pending results, returning what we have 15494 1726853353.86355: results queue empty 15494 1726853353.86356: checking for any_errors_fatal 15494 1726853353.86364: done checking for any_errors_fatal 15494 1726853353.86364: checking for max_fail_percentage 15494 1726853353.86366: done checking for max_fail_percentage 15494 1726853353.86367: checking to see if all hosts have failed and the running result is not ok 15494 1726853353.86368: done checking to see if all hosts have failed 15494 1726853353.86368: getting the remaining hosts for this loop 15494 1726853353.86370: done getting the remaining hosts for this loop 15494 1726853353.86376: getting the next task for host managed_node1 15494 1726853353.86382: done getting next task for host managed_node1 15494 1726853353.86386: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15494 1726853353.86389: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853353.86402: getting variables 15494 1726853353.86404: in VariableManager get_vars() 15494 1726853353.86442: Calling all_inventory to load vars for managed_node1 15494 1726853353.86445: Calling groups_inventory to load vars for managed_node1 15494 1726853353.86447: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853353.86458: Calling all_plugins_play to load vars for managed_node1 15494 1726853353.86460: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853353.86463: Calling groups_plugins_play to load vars for managed_node1 15494 1726853353.88022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853353.89739: done with get_vars() 15494 1726853353.89765: done getting variables 15494 1726853353.89837: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:29:13 -0400 (0:00:00.099) 0:00:22.514 ****** 15494 1726853353.89873: entering _queue_task() for managed_node1/service 15494 1726853353.90315: worker is 1 (out of 1 available) 15494 1726853353.90327: exiting _queue_task() for managed_node1/service 15494 1726853353.90337: done queuing things up, now waiting for results queue to drain 15494 1726853353.90339: waiting for pending results... 
15494 1726853353.90577: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15494 1726853353.90878: in run() - task 02083763-bbaf-0028-1a50-000000000046 15494 1726853353.90882: variable 'ansible_search_path' from source: unknown 15494 1726853353.90885: variable 'ansible_search_path' from source: unknown 15494 1726853353.90887: calling self._execute() 15494 1726853353.90890: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853353.90892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853353.90896: variable 'omit' from source: magic vars 15494 1726853353.91245: variable 'ansible_distribution_major_version' from source: facts 15494 1726853353.91262: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853353.91422: variable 'network_provider' from source: set_fact 15494 1726853353.91434: variable 'network_state' from source: role '' defaults 15494 1726853353.91451: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15494 1726853353.91461: variable 'omit' from source: magic vars 15494 1726853353.91501: variable 'omit' from source: magic vars 15494 1726853353.91532: variable 'network_service_name' from source: role '' defaults 15494 1726853353.91602: variable 'network_service_name' from source: role '' defaults 15494 1726853353.91713: variable '__network_provider_setup' from source: role '' defaults 15494 1726853353.91724: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853353.91793: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853353.91806: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853353.91870: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853353.92102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15494 1726853353.94219: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853353.94304: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853353.94347: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853353.94394: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853353.94427: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853353.94519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.94556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.94593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.94697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.94701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.94708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15494 1726853353.94738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.94769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.94820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.94840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.95088: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15494 1726853353.95215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.95248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.95348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.95352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.95356: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.95461: variable 'ansible_python' from source: facts 15494 1726853353.95492: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15494 1726853353.95583: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853353.95661: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853353.95802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.95832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.95861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.95909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.95928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.95982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853353.96112: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853353.96116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.96118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853353.96121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853353.96286: variable 'network_connections' from source: play vars 15494 1726853353.96301: variable 'profile' from source: play vars 15494 1726853353.96387: variable 'profile' from source: play vars 15494 1726853353.96397: variable 'interface' from source: set_fact 15494 1726853353.96455: variable 'interface' from source: set_fact 15494 1726853353.96570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853353.96780: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853353.96838: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853353.96906: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853353.96954: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853353.97020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853353.97078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853353.97104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853353.97138: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853353.97301: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.97487: variable 'network_connections' from source: play vars 15494 1726853353.97499: variable 'profile' from source: play vars 15494 1726853353.97580: variable 'profile' from source: play vars 15494 1726853353.97594: variable 'interface' from source: set_fact 15494 1726853353.97658: variable 'interface' from source: set_fact 15494 1726853353.97697: variable '__network_packages_default_wireless' from source: role '' defaults 15494 1726853353.97781: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853353.98075: variable 'network_connections' from source: play vars 15494 1726853353.98086: variable 'profile' from source: play vars 15494 1726853353.98155: variable 'profile' from source: play vars 15494 1726853353.98165: variable 'interface' from source: set_fact 15494 1726853353.98242: variable 'interface' from source: set_fact 15494 1726853353.98275: variable '__network_packages_default_team' from source: role '' defaults 15494 1726853353.98360: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853353.98648: variable 
'network_connections' from source: play vars 15494 1726853353.98976: variable 'profile' from source: play vars 15494 1726853353.98979: variable 'profile' from source: play vars 15494 1726853353.98981: variable 'interface' from source: set_fact 15494 1726853353.98983: variable 'interface' from source: set_fact 15494 1726853353.99031: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853353.99249: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853353.99261: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853353.99390: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853353.99798: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15494 1726853354.00430: variable 'network_connections' from source: play vars 15494 1726853354.00441: variable 'profile' from source: play vars 15494 1726853354.00504: variable 'profile' from source: play vars 15494 1726853354.00515: variable 'interface' from source: set_fact 15494 1726853354.00587: variable 'interface' from source: set_fact 15494 1726853354.00600: variable 'ansible_distribution' from source: facts 15494 1726853354.00607: variable '__network_rh_distros' from source: role '' defaults 15494 1726853354.00619: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.00637: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15494 1726853354.00812: variable 'ansible_distribution' from source: facts 15494 1726853354.00836: variable '__network_rh_distros' from source: role '' defaults 15494 1726853354.00839: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.00848: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15494 1726853354.01019: variable 'ansible_distribution' from source: 
facts 15494 1726853354.01053: variable '__network_rh_distros' from source: role '' defaults 15494 1726853354.01057: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.01083: variable 'network_provider' from source: set_fact 15494 1726853354.01112: variable 'omit' from source: magic vars 15494 1726853354.01162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853354.01181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853354.01206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853354.01272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853354.01276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853354.01279: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853354.01285: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853354.01292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853354.01393: Set connection var ansible_connection to ssh 15494 1726853354.01402: Set connection var ansible_pipelining to False 15494 1726853354.01411: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853354.01416: Set connection var ansible_shell_type to sh 15494 1726853354.01423: Set connection var ansible_timeout to 10 15494 1726853354.01432: Set connection var ansible_shell_executable to /bin/sh 15494 1726853354.01457: variable 'ansible_shell_executable' from source: unknown 15494 1726853354.01484: variable 'ansible_connection' from source: unknown 15494 1726853354.01488: variable 'ansible_module_compression' from source: unknown 15494 1726853354.01489: 
variable 'ansible_shell_type' from source: unknown 15494 1726853354.01491: variable 'ansible_shell_executable' from source: unknown 15494 1726853354.01492: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853354.01497: variable 'ansible_pipelining' from source: unknown 15494 1726853354.01499: variable 'ansible_timeout' from source: unknown 15494 1726853354.01500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853354.01677: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853354.01681: variable 'omit' from source: magic vars 15494 1726853354.01684: starting attempt loop 15494 1726853354.01686: running the handler 15494 1726853354.01707: variable 'ansible_facts' from source: unknown 15494 1726853354.02555: _low_level_execute_command(): starting 15494 1726853354.02575: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853354.03260: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853354.03277: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853354.03340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853354.03393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853354.03415: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853354.03443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853354.03566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853354.05263: stdout chunk (state=3): >>>/root <<< 15494 1726853354.05699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853354.05702: stdout chunk (state=3): >>><<< 15494 1726853354.05705: stderr chunk (state=3): >>><<< 15494 1726853354.05708: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853354.05710: _low_level_execute_command(): starting 15494 1726853354.05713: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451 `" && echo ansible-tmp-1726853354.0560386-16479-146653848679451="` echo /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451 `" ) && sleep 0' 15494 1726853354.06730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853354.06735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853354.06737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853354.06745: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853354.06747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853354.07061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853354.09189: stdout chunk (state=3): >>>ansible-tmp-1726853354.0560386-16479-146653848679451=/root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451 <<< 15494 1726853354.09293: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853354.09296: stdout chunk (state=3): >>><<< 15494 1726853354.09299: stderr chunk (state=3): >>><<< 15494 1726853354.09577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853354.0560386-16479-146653848679451=/root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853354.09581: variable 'ansible_module_compression' from source: unknown 15494 
1726853354.09584: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15494 1726853354.09586: variable 'ansible_facts' from source: unknown 15494 1726853354.10166: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py 15494 1726853354.10597: Sending initial data 15494 1726853354.10600: Sent initial data (156 bytes) 15494 1726853354.11849: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853354.11992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853354.12007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853354.12079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853354.13696: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853354.13710: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 <<< 15494 1726853354.13719: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15494 1726853354.13803: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853354.13912: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853354.13953: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpytozxxg9 /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py <<< 15494 1726853354.13957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py" <<< 15494 1726853354.14023: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpytozxxg9" to remote "/root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py" <<< 15494 1726853354.16248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853354.16262: stdout chunk (state=3): >>><<< 15494 1726853354.16436: stderr chunk (state=3): >>><<< 15494 1726853354.16451: done transferring module to remote 
15494 1726853354.16467: _low_level_execute_command(): starting 15494 1726853354.16555: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/ /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py && sleep 0' 15494 1726853354.17820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853354.17864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853354.17888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853354.18175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853354.18184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853354.18201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853354.18267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853354.20245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853354.20249: stdout chunk (state=3): >>><<< 15494 
1726853354.20258: stderr chunk (state=3): >>><<< 15494 1726853354.20278: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853354.20281: _low_level_execute_command(): starting 15494 1726853354.20285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/AnsiballZ_systemd.py && sleep 0' 15494 1726853354.21329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853354.21682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853354.21701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853354.21707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853354.21781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853354.50707: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", 
"NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313475584", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "750308000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": 
"[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15494 1726853354.50743: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", 
"LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15494 1726853354.52677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853354.52682: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853354.52684: stderr chunk (state=3): >>><<< 15494 1726853354.52686: stdout chunk (state=3): >>><<< 15494 1726853354.52689: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10649600", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313475584", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "750308000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853354.52864: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853354.52887: _low_level_execute_command(): starting 15494 1726853354.52890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853354.0560386-16479-146653848679451/ > /dev/null 2>&1 && sleep 0' 15494 1726853354.53899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853354.53912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853354.54144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853354.54161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853354.54365: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853354.54485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853354.56292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853354.56334: stderr chunk (state=3): >>><<< 15494 1726853354.56394: stdout chunk (state=3): >>><<< 15494 1726853354.56800: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15494 1726853354.56804: handler run complete 15494 1726853354.56806: attempt loop complete, returning result 15494 1726853354.56808: _execute() done 15494 1726853354.56810: dumping result to json 15494 1726853354.56813: done dumping result, returning 15494 1726853354.56815: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-0028-1a50-000000000046] 15494 1726853354.56817: sending task result for task 02083763-bbaf-0028-1a50-000000000046 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853354.57428: no more pending results, returning what we have 15494 1726853354.57432: results queue empty 15494 1726853354.57432: checking for any_errors_fatal 15494 1726853354.57439: done checking for any_errors_fatal 15494 1726853354.57440: checking for max_fail_percentage 15494 1726853354.57441: done checking for max_fail_percentage 15494 1726853354.57442: checking to see if all hosts have failed and the running result is not ok 15494 1726853354.57443: done checking to see if all hosts have failed 15494 1726853354.57443: getting the remaining hosts for this loop 15494 1726853354.57445: done getting the remaining hosts for this loop 15494 1726853354.57449: getting the next task for host managed_node1 15494 1726853354.57455: done getting next task for host managed_node1 15494 1726853354.57459: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15494 1726853354.57461: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853354.57473: getting variables 15494 1726853354.57476: in VariableManager get_vars() 15494 1726853354.57515: Calling all_inventory to load vars for managed_node1 15494 1726853354.57519: Calling groups_inventory to load vars for managed_node1 15494 1726853354.57522: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853354.57535: Calling all_plugins_play to load vars for managed_node1 15494 1726853354.57539: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853354.57545: Calling groups_plugins_play to load vars for managed_node1 15494 1726853354.58488: done sending task result for task 02083763-bbaf-0028-1a50-000000000046 15494 1726853354.58491: WORKER PROCESS EXITING 15494 1726853354.60800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853354.62863: done with get_vars() 15494 1726853354.62910: done getting variables 15494 1726853354.62992: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:29:14 -0400 (0:00:00.731) 0:00:23.246 ****** 15494 1726853354.63039: entering _queue_task() for managed_node1/service 15494 1726853354.63410: worker is 1 (out of 1 available) 15494 1726853354.63422: exiting _queue_task() for managed_node1/service 15494 1726853354.63436: done queuing things up, now waiting for results queue to drain 15494 1726853354.63437: waiting for pending results... 
15494 1726853354.63744: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15494 1726853354.63894: in run() - task 02083763-bbaf-0028-1a50-000000000047 15494 1726853354.63930: variable 'ansible_search_path' from source: unknown 15494 1726853354.63977: variable 'ansible_search_path' from source: unknown 15494 1726853354.63994: calling self._execute() 15494 1726853354.64097: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853354.64118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853354.64134: variable 'omit' from source: magic vars 15494 1726853354.64719: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.64722: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853354.64987: variable 'network_provider' from source: set_fact 15494 1726853354.65046: Evaluated conditional (network_provider == "nm"): True 15494 1726853354.65253: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853354.65362: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853354.65545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853354.69356: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853354.69502: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853354.69564: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853354.69617: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853354.69645: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853354.69811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853354.69854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853354.70121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853354.70160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853354.70377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853354.70380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853354.70382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853354.70384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853354.70537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853354.70562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853354.70616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853354.70660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853354.70896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853354.71057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853354.71089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853354.71532: variable 'network_connections' from source: play vars 15494 1726853354.71545: variable 'profile' from source: play vars 15494 1726853354.71640: variable 'profile' from source: play vars 15494 1726853354.71653: variable 'interface' from source: set_fact 15494 1726853354.71802: variable 'interface' from source: set_fact 15494 1726853354.72014: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853354.72236: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853354.72248: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853354.72312: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853354.72325: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853354.72391: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853354.72395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853354.72411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853354.72530: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853354.72534: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853354.72877: variable 'network_connections' from source: play vars 15494 1726853354.72880: variable 'profile' from source: play vars 15494 1726853354.72882: variable 'profile' from source: play vars 15494 1726853354.72884: variable 'interface' from source: set_fact 15494 1726853354.72886: variable 'interface' from source: set_fact 15494 1726853354.72889: Evaluated conditional (__network_wpa_supplicant_required): False 15494 1726853354.72891: when evaluation is False, skipping this task 15494 1726853354.73183: _execute() done 15494 1726853354.73195: dumping result 
to json 15494 1726853354.73198: done dumping result, returning 15494 1726853354.73201: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-0028-1a50-000000000047] 15494 1726853354.73203: sending task result for task 02083763-bbaf-0028-1a50-000000000047 15494 1726853354.73298: done sending task result for task 02083763-bbaf-0028-1a50-000000000047 15494 1726853354.73301: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15494 1726853354.73422: no more pending results, returning what we have 15494 1726853354.73428: results queue empty 15494 1726853354.73429: checking for any_errors_fatal 15494 1726853354.73446: done checking for any_errors_fatal 15494 1726853354.73447: checking for max_fail_percentage 15494 1726853354.73449: done checking for max_fail_percentage 15494 1726853354.73450: checking to see if all hosts have failed and the running result is not ok 15494 1726853354.73451: done checking to see if all hosts have failed 15494 1726853354.73452: getting the remaining hosts for this loop 15494 1726853354.73454: done getting the remaining hosts for this loop 15494 1726853354.73458: getting the next task for host managed_node1 15494 1726853354.73465: done getting next task for host managed_node1 15494 1726853354.73468: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15494 1726853354.73470: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853354.73487: getting variables 15494 1726853354.73488: in VariableManager get_vars() 15494 1726853354.73527: Calling all_inventory to load vars for managed_node1 15494 1726853354.73529: Calling groups_inventory to load vars for managed_node1 15494 1726853354.73532: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853354.73541: Calling all_plugins_play to load vars for managed_node1 15494 1726853354.73544: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853354.73547: Calling groups_plugins_play to load vars for managed_node1 15494 1726853354.75623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853354.77175: done with get_vars() 15494 1726853354.77200: done getting variables 15494 1726853354.77264: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:29:14 -0400 (0:00:00.142) 0:00:23.388 ****** 15494 1726853354.77299: entering _queue_task() for managed_node1/service 15494 1726853354.77628: worker is 1 (out of 1 available) 15494 1726853354.77641: exiting _queue_task() for managed_node1/service 15494 1726853354.77656: done queuing things up, now waiting for results queue to drain 15494 1726853354.77658: waiting for pending results... 
15494 1726853354.77931: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15494 1726853354.78057: in run() - task 02083763-bbaf-0028-1a50-000000000048 15494 1726853354.78095: variable 'ansible_search_path' from source: unknown 15494 1726853354.78099: variable 'ansible_search_path' from source: unknown 15494 1726853354.78278: calling self._execute() 15494 1726853354.78281: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853354.78284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853354.78286: variable 'omit' from source: magic vars 15494 1726853354.78615: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.78636: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853354.78763: variable 'network_provider' from source: set_fact 15494 1726853354.78776: Evaluated conditional (network_provider == "initscripts"): False 15494 1726853354.78783: when evaluation is False, skipping this task 15494 1726853354.78789: _execute() done 15494 1726853354.78794: dumping result to json 15494 1726853354.78800: done dumping result, returning 15494 1726853354.78809: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-0028-1a50-000000000048] 15494 1726853354.78838: sending task result for task 02083763-bbaf-0028-1a50-000000000048 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853354.79025: no more pending results, returning what we have 15494 1726853354.79030: results queue empty 15494 1726853354.79031: checking for any_errors_fatal 15494 1726853354.79041: done checking for any_errors_fatal 15494 1726853354.79042: checking for max_fail_percentage 15494 1726853354.79044: done checking for max_fail_percentage 15494 
1726853354.79045: checking to see if all hosts have failed and the running result is not ok 15494 1726853354.79046: done checking to see if all hosts have failed 15494 1726853354.79049: getting the remaining hosts for this loop 15494 1726853354.79051: done getting the remaining hosts for this loop 15494 1726853354.79054: getting the next task for host managed_node1 15494 1726853354.79061: done getting next task for host managed_node1 15494 1726853354.79064: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15494 1726853354.79067: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853354.79085: getting variables 15494 1726853354.79087: in VariableManager get_vars() 15494 1726853354.79125: Calling all_inventory to load vars for managed_node1 15494 1726853354.79127: Calling groups_inventory to load vars for managed_node1 15494 1726853354.79130: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853354.79141: Calling all_plugins_play to load vars for managed_node1 15494 1726853354.79144: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853354.79150: Calling groups_plugins_play to load vars for managed_node1 15494 1726853354.79686: done sending task result for task 02083763-bbaf-0028-1a50-000000000048 15494 1726853354.79689: WORKER PROCESS EXITING 15494 1726853354.80606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853354.82251: done with get_vars() 15494 1726853354.82273: done getting variables 15494 1726853354.82330: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:29:14 -0400 (0:00:00.050) 0:00:23.439 ****** 15494 1726853354.82360: entering _queue_task() for managed_node1/copy 15494 1726853354.82626: worker is 1 (out of 1 available) 15494 1726853354.82637: exiting _queue_task() for managed_node1/copy 15494 1726853354.82648: done queuing things up, now waiting for results queue to drain 15494 1726853354.82649: waiting for pending results... 15494 1726853354.82915: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15494 1726853354.83022: in run() - task 02083763-bbaf-0028-1a50-000000000049 15494 1726853354.83040: variable 'ansible_search_path' from source: unknown 15494 1726853354.83046: variable 'ansible_search_path' from source: unknown 15494 1726853354.83084: calling self._execute() 15494 1726853354.83185: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853354.83197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853354.83215: variable 'omit' from source: magic vars 15494 1726853354.83589: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.83605: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853354.83722: variable 'network_provider' from source: set_fact 15494 1726853354.83733: Evaluated conditional (network_provider == "initscripts"): False 15494 1726853354.83740: when evaluation is False, skipping this task 15494 1726853354.83855: _execute() done 15494 1726853354.83859: dumping result to json 
15494 1726853354.83862: done dumping result, returning 15494 1726853354.83866: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-0028-1a50-000000000049] 15494 1726853354.83868: sending task result for task 02083763-bbaf-0028-1a50-000000000049 15494 1726853354.83936: done sending task result for task 02083763-bbaf-0028-1a50-000000000049 15494 1726853354.83939: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15494 1726853354.83985: no more pending results, returning what we have 15494 1726853354.83989: results queue empty 15494 1726853354.83990: checking for any_errors_fatal 15494 1726853354.83994: done checking for any_errors_fatal 15494 1726853354.83995: checking for max_fail_percentage 15494 1726853354.83997: done checking for max_fail_percentage 15494 1726853354.83998: checking to see if all hosts have failed and the running result is not ok 15494 1726853354.83999: done checking to see if all hosts have failed 15494 1726853354.83999: getting the remaining hosts for this loop 15494 1726853354.84001: done getting the remaining hosts for this loop 15494 1726853354.84004: getting the next task for host managed_node1 15494 1726853354.84010: done getting next task for host managed_node1 15494 1726853354.84014: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15494 1726853354.84016: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853354.84032: getting variables 15494 1726853354.84033: in VariableManager get_vars() 15494 1726853354.84070: Calling all_inventory to load vars for managed_node1 15494 1726853354.84074: Calling groups_inventory to load vars for managed_node1 15494 1726853354.84077: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853354.84088: Calling all_plugins_play to load vars for managed_node1 15494 1726853354.84091: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853354.84094: Calling groups_plugins_play to load vars for managed_node1 15494 1726853354.85541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853354.87096: done with get_vars() 15494 1726853354.87119: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:29:14 -0400 (0:00:00.048) 0:00:23.487 ****** 15494 1726853354.87198: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15494 1726853354.87483: worker is 1 (out of 1 available) 15494 1726853354.87495: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15494 1726853354.87507: done queuing things up, now waiting for results queue to drain 15494 1726853354.87508: waiting for pending results... 
15494 1726853354.87888: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15494 1726853354.87893: in run() - task 02083763-bbaf-0028-1a50-00000000004a 15494 1726853354.87906: variable 'ansible_search_path' from source: unknown 15494 1726853354.87913: variable 'ansible_search_path' from source: unknown 15494 1726853354.87951: calling self._execute() 15494 1726853354.88049: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853354.88061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853354.88078: variable 'omit' from source: magic vars 15494 1726853354.88453: variable 'ansible_distribution_major_version' from source: facts 15494 1726853354.88469: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853354.88482: variable 'omit' from source: magic vars 15494 1726853354.88527: variable 'omit' from source: magic vars 15494 1726853354.88694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853354.90780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853354.90848: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853354.90889: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853354.90932: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853354.90964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853354.91053: variable 'network_provider' from source: set_fact 15494 1726853354.91189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853354.91234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853354.91268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853354.91355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853354.91358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853354.91410: variable 'omit' from source: magic vars 15494 1726853354.91527: variable 'omit' from source: magic vars 15494 1726853354.91631: variable 'network_connections' from source: play vars 15494 1726853354.91648: variable 'profile' from source: play vars 15494 1726853354.91715: variable 'profile' from source: play vars 15494 1726853354.91726: variable 'interface' from source: set_fact 15494 1726853354.91876: variable 'interface' from source: set_fact 15494 1726853354.91933: variable 'omit' from source: magic vars 15494 1726853354.91944: variable '__lsr_ansible_managed' from source: task vars 15494 1726853354.92000: variable '__lsr_ansible_managed' from source: task vars 15494 1726853354.92165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15494 1726853354.93047: Loaded config def from plugin (lookup/template) 15494 1726853354.93109: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15494 1726853354.93285: File lookup term: get_ansible_managed.j2 15494 1726853354.93288: variable 'ansible_search_path' from source: unknown 15494 1726853354.93290: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15494 1726853354.93293: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15494 1726853354.93296: variable 'ansible_search_path' from source: unknown 15494 1726853355.01600: variable 'ansible_managed' from source: unknown 15494 1726853355.01729: variable 'omit' from source: magic vars 15494 1726853355.01760: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853355.01794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853355.01815: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853355.01840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.01853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.01886: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853355.01893: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.01900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.01997: Set connection var ansible_connection to ssh 15494 1726853355.02010: Set connection var ansible_pipelining to False 15494 1726853355.02020: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853355.02028: Set connection var ansible_shell_type to sh 15494 1726853355.02039: Set connection var ansible_timeout to 10 15494 1726853355.02056: Set connection var ansible_shell_executable to /bin/sh 15494 1726853355.02087: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.02096: variable 'ansible_connection' from source: unknown 15494 1726853355.02103: variable 'ansible_module_compression' from source: unknown 15494 1726853355.02161: variable 'ansible_shell_type' from source: unknown 15494 1726853355.02165: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.02167: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.02170: variable 'ansible_pipelining' from source: unknown 15494 1726853355.02173: variable 'ansible_timeout' from source: unknown 15494 1726853355.02176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.02276: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853355.02299: variable 'omit' from source: magic vars 15494 1726853355.02309: starting attempt loop 15494 1726853355.02317: running the handler 15494 1726853355.02333: _low_level_execute_command(): starting 15494 1726853355.02377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853355.03093: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.03159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853355.03202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.03257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.05368: stdout chunk (state=3): >>>/root <<< 15494 1726853355.05374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853355.05377: stdout chunk 
(state=3): >>><<< 15494 1726853355.05379: stderr chunk (state=3): >>><<< 15494 1726853355.05381: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853355.05384: _low_level_execute_command(): starting 15494 1726853355.05387: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409 `" && echo ansible-tmp-1726853355.0528357-16550-100630618071409="` echo /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409 `" ) && sleep 0' 15494 1726853355.06627: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853355.06631: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853355.06633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.06635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853355.06638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.06893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853355.06905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.06985: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.08848: stdout chunk (state=3): >>>ansible-tmp-1726853355.0528357-16550-100630618071409=/root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409 <<< 15494 1726853355.08957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853355.09165: stderr chunk (state=3): >>><<< 15494 1726853355.09169: stdout chunk (state=3): >>><<< 15494 1726853355.09173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853355.0528357-16550-100630618071409=/root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853355.09176: variable 'ansible_module_compression' from source: unknown 15494 1726853355.09376: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15494 1726853355.09380: variable 'ansible_facts' from source: unknown 15494 1726853355.09547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py 15494 1726853355.09834: Sending initial data 15494 1726853355.09934: Sent initial data (168 bytes) 15494 1726853355.11077: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853355.11080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853355.11096: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.11226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853355.11244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853355.11394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.11518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.13050: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853355.13090: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853355.13130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmprtwip1us /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py <<< 15494 1726853355.13143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py" <<< 15494 1726853355.13176: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmprtwip1us" to remote "/root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py" <<< 15494 1726853355.15178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853355.15181: stderr chunk (state=3): >>><<< 15494 1726853355.15184: stdout chunk (state=3): >>><<< 15494 1726853355.15186: done transferring module to remote 15494 1726853355.15188: _low_level_execute_command(): starting 15494 1726853355.15190: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/ /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py && sleep 0' 15494 1726853355.16393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found <<< 15494 1726853355.16428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.16598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853355.16692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.17188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.18975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853355.18978: stdout chunk (state=3): >>><<< 15494 1726853355.18980: stderr chunk (state=3): >>><<< 15494 1726853355.18994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853355.19001: _low_level_execute_command(): starting 15494 1726853355.19025: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/AnsiballZ_network_connections.py && sleep 0' 15494 1726853355.20359: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853355.20486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.20670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 
1726853355.20689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.20765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.51539: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15494 1726853355.53645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853355.53664: stdout chunk (state=3): >>><<< 15494 1726853355.53679: stderr chunk (state=3): >>><<< 15494 1726853355.53700: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853355.53995: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853355.53998: _low_level_execute_command(): starting 15494 1726853355.54005: _low_level_execute_command(): executing: /bin/sh 
-c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853355.0528357-16550-100630618071409/ > /dev/null 2>&1 && sleep 0' 15494 1726853355.55207: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853355.55221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853355.55231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.55294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853355.55324: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.55379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.58078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853355.58083: stdout chunk (state=3): >>><<< 15494 1726853355.58086: stderr chunk (state=3): >>><<< 15494 1726853355.58089: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853355.58091: handler run complete 15494 1726853355.58096: attempt loop complete, returning result 15494 1726853355.58099: _execute() done 15494 1726853355.58101: dumping result to json 15494 1726853355.58103: done dumping result, returning 15494 1726853355.58105: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-0028-1a50-00000000004a] 15494 1726853355.58107: sending task result for task 02083763-bbaf-0028-1a50-00000000004a 15494 1726853355.58390: done sending task result for task 02083763-bbaf-0028-1a50-00000000004a 15494 1726853355.58394: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": 
"nm" } }, "changed": true } STDERR: 15494 1726853355.58565: no more pending results, returning what we have 15494 1726853355.58569: results queue empty 15494 1726853355.58570: checking for any_errors_fatal 15494 1726853355.58781: done checking for any_errors_fatal 15494 1726853355.58783: checking for max_fail_percentage 15494 1726853355.58785: done checking for max_fail_percentage 15494 1726853355.58786: checking to see if all hosts have failed and the running result is not ok 15494 1726853355.58787: done checking to see if all hosts have failed 15494 1726853355.58788: getting the remaining hosts for this loop 15494 1726853355.58789: done getting the remaining hosts for this loop 15494 1726853355.58793: getting the next task for host managed_node1 15494 1726853355.58799: done getting next task for host managed_node1 15494 1726853355.58803: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15494 1726853355.58809: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853355.58819: getting variables 15494 1726853355.58821: in VariableManager get_vars() 15494 1726853355.58857: Calling all_inventory to load vars for managed_node1 15494 1726853355.58860: Calling groups_inventory to load vars for managed_node1 15494 1726853355.58862: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853355.59276: Calling all_plugins_play to load vars for managed_node1 15494 1726853355.59282: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853355.59286: Calling groups_plugins_play to load vars for managed_node1 15494 1726853355.61125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853355.62912: done with get_vars() 15494 1726853355.62939: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:29:15 -0400 (0:00:00.758) 0:00:24.246 ****** 15494 1726853355.63025: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15494 1726853355.64011: worker is 1 (out of 1 available) 15494 1726853355.64023: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15494 1726853355.64034: done queuing things up, now waiting for results queue to drain 15494 1726853355.64035: waiting for pending results... 
15494 1726853355.64658: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15494 1726853355.64678: in run() - task 02083763-bbaf-0028-1a50-00000000004b 15494 1726853355.64699: variable 'ansible_search_path' from source: unknown 15494 1726853355.64707: variable 'ansible_search_path' from source: unknown 15494 1726853355.64758: calling self._execute() 15494 1726853355.64867: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.64881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.64898: variable 'omit' from source: magic vars 15494 1726853355.65419: variable 'ansible_distribution_major_version' from source: facts 15494 1726853355.65438: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853355.65563: variable 'network_state' from source: role '' defaults 15494 1726853355.65584: Evaluated conditional (network_state != {}): False 15494 1726853355.65592: when evaluation is False, skipping this task 15494 1726853355.65611: _execute() done 15494 1726853355.65615: dumping result to json 15494 1726853355.65617: done dumping result, returning 15494 1726853355.65777: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-0028-1a50-00000000004b] 15494 1726853355.65781: sending task result for task 02083763-bbaf-0028-1a50-00000000004b 15494 1726853355.65848: done sending task result for task 02083763-bbaf-0028-1a50-00000000004b 15494 1726853355.65851: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853355.65910: no more pending results, returning what we have 15494 1726853355.65915: results queue empty 15494 1726853355.65916: checking for any_errors_fatal 15494 1726853355.65927: done checking for any_errors_fatal 
15494 1726853355.65928: checking for max_fail_percentage 15494 1726853355.65930: done checking for max_fail_percentage 15494 1726853355.65931: checking to see if all hosts have failed and the running result is not ok 15494 1726853355.65932: done checking to see if all hosts have failed 15494 1726853355.65933: getting the remaining hosts for this loop 15494 1726853355.65935: done getting the remaining hosts for this loop 15494 1726853355.65938: getting the next task for host managed_node1 15494 1726853355.65946: done getting next task for host managed_node1 15494 1726853355.65950: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15494 1726853355.65953: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853355.65968: getting variables 15494 1726853355.65970: in VariableManager get_vars() 15494 1726853355.66011: Calling all_inventory to load vars for managed_node1 15494 1726853355.66014: Calling groups_inventory to load vars for managed_node1 15494 1726853355.66017: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853355.66029: Calling all_plugins_play to load vars for managed_node1 15494 1726853355.66033: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853355.66037: Calling groups_plugins_play to load vars for managed_node1 15494 1726853355.68644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853355.70245: done with get_vars() 15494 1726853355.70272: done getting variables 15494 1726853355.70332: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:29:15 -0400 (0:00:00.073) 0:00:24.319 ****** 15494 1726853355.70364: entering _queue_task() for managed_node1/debug 15494 1726853355.71086: worker is 1 (out of 1 available) 15494 1726853355.71098: exiting _queue_task() for managed_node1/debug 15494 1726853355.71137: done queuing things up, now waiting for results queue to drain 15494 1726853355.71139: waiting for pending results... 
15494 1726853355.71603: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15494 1726853355.71629: in run() - task 02083763-bbaf-0028-1a50-00000000004c 15494 1726853355.71744: variable 'ansible_search_path' from source: unknown 15494 1726853355.71978: variable 'ansible_search_path' from source: unknown 15494 1726853355.72148: calling self._execute() 15494 1726853355.72417: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.72420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.72498: variable 'omit' from source: magic vars 15494 1726853355.73719: variable 'ansible_distribution_major_version' from source: facts 15494 1726853355.73723: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853355.73726: variable 'omit' from source: magic vars 15494 1726853355.73778: variable 'omit' from source: magic vars 15494 1726853355.73826: variable 'omit' from source: magic vars 15494 1726853355.74122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853355.74318: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853355.74493: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853355.74497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.74503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.74681: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853355.74684: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.74693: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15494 1726853355.74874: Set connection var ansible_connection to ssh 15494 1726853355.74904: Set connection var ansible_pipelining to False 15494 1726853355.74928: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853355.74937: Set connection var ansible_shell_type to sh 15494 1726853355.74948: Set connection var ansible_timeout to 10 15494 1726853355.74970: Set connection var ansible_shell_executable to /bin/sh 15494 1726853355.75085: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.75090: variable 'ansible_connection' from source: unknown 15494 1726853355.75093: variable 'ansible_module_compression' from source: unknown 15494 1726853355.75095: variable 'ansible_shell_type' from source: unknown 15494 1726853355.75097: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.75099: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.75101: variable 'ansible_pipelining' from source: unknown 15494 1726853355.75104: variable 'ansible_timeout' from source: unknown 15494 1726853355.75111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.75403: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853355.75408: variable 'omit' from source: magic vars 15494 1726853355.75416: starting attempt loop 15494 1726853355.75424: running the handler 15494 1726853355.75627: variable '__network_connections_result' from source: set_fact 15494 1726853355.75981: handler run complete 15494 1726853355.75984: attempt loop complete, returning result 15494 1726853355.75987: _execute() done 15494 1726853355.75989: dumping result to json 15494 1726853355.75991: 
done dumping result, returning 15494 1726853355.75994: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-0028-1a50-00000000004c] 15494 1726853355.75996: sending task result for task 02083763-bbaf-0028-1a50-00000000004c 15494 1726853355.76066: done sending task result for task 02083763-bbaf-0028-1a50-00000000004c 15494 1726853355.76070: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 15494 1726853355.76149: no more pending results, returning what we have 15494 1726853355.76153: results queue empty 15494 1726853355.76154: checking for any_errors_fatal 15494 1726853355.76161: done checking for any_errors_fatal 15494 1726853355.76162: checking for max_fail_percentage 15494 1726853355.76164: done checking for max_fail_percentage 15494 1726853355.76165: checking to see if all hosts have failed and the running result is not ok 15494 1726853355.76166: done checking to see if all hosts have failed 15494 1726853355.76166: getting the remaining hosts for this loop 15494 1726853355.76168: done getting the remaining hosts for this loop 15494 1726853355.76174: getting the next task for host managed_node1 15494 1726853355.76181: done getting next task for host managed_node1 15494 1726853355.76185: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15494 1726853355.76188: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853355.76198: getting variables 15494 1726853355.76200: in VariableManager get_vars() 15494 1726853355.76239: Calling all_inventory to load vars for managed_node1 15494 1726853355.76242: Calling groups_inventory to load vars for managed_node1 15494 1726853355.76244: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853355.76259: Calling all_plugins_play to load vars for managed_node1 15494 1726853355.76263: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853355.76266: Calling groups_plugins_play to load vars for managed_node1 15494 1726853355.78008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853355.80887: done with get_vars() 15494 1726853355.80911: done getting variables 15494 1726853355.81003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:29:15 -0400 (0:00:00.106) 0:00:24.427 ****** 15494 1726853355.81131: entering _queue_task() for managed_node1/debug 15494 1726853355.81511: worker is 1 (out of 1 available) 15494 1726853355.81525: exiting _queue_task() for managed_node1/debug 15494 1726853355.81536: done queuing things up, now waiting for results queue to drain 15494 1726853355.81537: waiting for pending results... 
15494 1726853355.81823: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15494 1726853355.81962: in run() - task 02083763-bbaf-0028-1a50-00000000004d 15494 1726853355.81986: variable 'ansible_search_path' from source: unknown 15494 1726853355.81997: variable 'ansible_search_path' from source: unknown 15494 1726853355.82039: calling self._execute() 15494 1726853355.82143: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.82157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.82210: variable 'omit' from source: magic vars 15494 1726853355.82702: variable 'ansible_distribution_major_version' from source: facts 15494 1726853355.82717: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853355.82743: variable 'omit' from source: magic vars 15494 1726853355.83005: variable 'omit' from source: magic vars 15494 1726853355.83008: variable 'omit' from source: magic vars 15494 1726853355.83011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853355.83014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853355.83016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853355.83018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.83032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.83070: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853355.83081: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.83088: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15494 1726853355.83251: Set connection var ansible_connection to ssh 15494 1726853355.83264: Set connection var ansible_pipelining to False 15494 1726853355.83277: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853355.83285: Set connection var ansible_shell_type to sh 15494 1726853355.83297: Set connection var ansible_timeout to 10 15494 1726853355.83309: Set connection var ansible_shell_executable to /bin/sh 15494 1726853355.83339: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.83353: variable 'ansible_connection' from source: unknown 15494 1726853355.83361: variable 'ansible_module_compression' from source: unknown 15494 1726853355.83368: variable 'ansible_shell_type' from source: unknown 15494 1726853355.83377: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.83385: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.83392: variable 'ansible_pipelining' from source: unknown 15494 1726853355.83399: variable 'ansible_timeout' from source: unknown 15494 1726853355.83405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.83546: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853355.83570: variable 'omit' from source: magic vars 15494 1726853355.83584: starting attempt loop 15494 1726853355.83591: running the handler 15494 1726853355.83665: variable '__network_connections_result' from source: set_fact 15494 1726853355.83752: variable '__network_connections_result' from source: set_fact 15494 1726853355.83863: handler run complete 15494 1726853355.83976: attempt loop complete, returning result 15494 1726853355.83979: 
_execute() done 15494 1726853355.83982: dumping result to json 15494 1726853355.83984: done dumping result, returning 15494 1726853355.83987: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-0028-1a50-00000000004d] 15494 1726853355.83989: sending task result for task 02083763-bbaf-0028-1a50-00000000004d 15494 1726853355.84285: done sending task result for task 02083763-bbaf-0028-1a50-00000000004d 15494 1726853355.84289: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15494 1726853355.84373: no more pending results, returning what we have 15494 1726853355.84376: results queue empty 15494 1726853355.84377: checking for any_errors_fatal 15494 1726853355.84383: done checking for any_errors_fatal 15494 1726853355.84384: checking for max_fail_percentage 15494 1726853355.84385: done checking for max_fail_percentage 15494 1726853355.84386: checking to see if all hosts have failed and the running result is not ok 15494 1726853355.84387: done checking to see if all hosts have failed 15494 1726853355.84388: getting the remaining hosts for this loop 15494 1726853355.84389: done getting the remaining hosts for this loop 15494 1726853355.84392: getting the next task for host managed_node1 15494 1726853355.84397: done getting next task for host managed_node1 15494 1726853355.84401: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15494 1726853355.84403: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853355.84412: getting variables 15494 1726853355.84414: in VariableManager get_vars() 15494 1726853355.84457: Calling all_inventory to load vars for managed_node1 15494 1726853355.84460: Calling groups_inventory to load vars for managed_node1 15494 1726853355.84463: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853355.84521: Calling all_plugins_play to load vars for managed_node1 15494 1726853355.84525: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853355.84529: Calling groups_plugins_play to load vars for managed_node1 15494 1726853355.86724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853355.88557: done with get_vars() 15494 1726853355.88588: done getting variables 15494 1726853355.88651: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:29:15 -0400 (0:00:00.075) 0:00:24.502 ****** 15494 1726853355.88691: entering _queue_task() for managed_node1/debug 15494 1726853355.89033: worker is 1 (out of 1 available) 15494 1726853355.89046: exiting _queue_task() for managed_node1/debug 15494 1726853355.89061: done queuing things up, now waiting for results queue to drain 15494 1726853355.89062: waiting for pending results... 
15494 1726853355.89340: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15494 1726853355.89456: in run() - task 02083763-bbaf-0028-1a50-00000000004e 15494 1726853355.89480: variable 'ansible_search_path' from source: unknown 15494 1726853355.89490: variable 'ansible_search_path' from source: unknown 15494 1726853355.89533: calling self._execute() 15494 1726853355.89638: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.89654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.89673: variable 'omit' from source: magic vars 15494 1726853355.90063: variable 'ansible_distribution_major_version' from source: facts 15494 1726853355.90082: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853355.90206: variable 'network_state' from source: role '' defaults 15494 1726853355.90255: Evaluated conditional (network_state != {}): False 15494 1726853355.90259: when evaluation is False, skipping this task 15494 1726853355.90261: _execute() done 15494 1726853355.90264: dumping result to json 15494 1726853355.90266: done dumping result, returning 15494 1726853355.90268: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-0028-1a50-00000000004e] 15494 1726853355.90272: sending task result for task 02083763-bbaf-0028-1a50-00000000004e skipping: [managed_node1] => { "false_condition": "network_state != {}" } 15494 1726853355.90519: no more pending results, returning what we have 15494 1726853355.90524: results queue empty 15494 1726853355.90525: checking for any_errors_fatal 15494 1726853355.90534: done checking for any_errors_fatal 15494 1726853355.90535: checking for max_fail_percentage 15494 1726853355.90537: done checking for max_fail_percentage 15494 1726853355.90538: checking to see if all hosts have 
failed and the running result is not ok 15494 1726853355.90539: done checking to see if all hosts have failed 15494 1726853355.90539: getting the remaining hosts for this loop 15494 1726853355.90541: done getting the remaining hosts for this loop 15494 1726853355.90545: getting the next task for host managed_node1 15494 1726853355.90552: done getting next task for host managed_node1 15494 1726853355.90556: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15494 1726853355.90558: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853355.90574: getting variables 15494 1726853355.90576: in VariableManager get_vars() 15494 1726853355.90613: Calling all_inventory to load vars for managed_node1 15494 1726853355.90616: Calling groups_inventory to load vars for managed_node1 15494 1726853355.90618: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853355.90631: Calling all_plugins_play to load vars for managed_node1 15494 1726853355.90634: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853355.90637: Calling groups_plugins_play to load vars for managed_node1 15494 1726853355.91345: done sending task result for task 02083763-bbaf-0028-1a50-00000000004e 15494 1726853355.91348: WORKER PROCESS EXITING 15494 1726853355.92229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853355.93848: done with get_vars() 15494 1726853355.93869: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:29:15 -0400 
(0:00:00.052) 0:00:24.555 ****** 15494 1726853355.93968: entering _queue_task() for managed_node1/ping 15494 1726853355.94278: worker is 1 (out of 1 available) 15494 1726853355.94291: exiting _queue_task() for managed_node1/ping 15494 1726853355.94303: done queuing things up, now waiting for results queue to drain 15494 1726853355.94304: waiting for pending results... 15494 1726853355.94576: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15494 1726853355.94692: in run() - task 02083763-bbaf-0028-1a50-00000000004f 15494 1726853355.94717: variable 'ansible_search_path' from source: unknown 15494 1726853355.94726: variable 'ansible_search_path' from source: unknown 15494 1726853355.94765: calling self._execute() 15494 1726853355.94864: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.94920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.94924: variable 'omit' from source: magic vars 15494 1726853355.95278: variable 'ansible_distribution_major_version' from source: facts 15494 1726853355.95294: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853355.95304: variable 'omit' from source: magic vars 15494 1726853355.95343: variable 'omit' from source: magic vars 15494 1726853355.95390: variable 'omit' from source: magic vars 15494 1726853355.95464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853355.95473: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853355.95497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853355.95516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.95529: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853355.95560: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853355.95575: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.95680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.95684: Set connection var ansible_connection to ssh 15494 1726853355.95692: Set connection var ansible_pipelining to False 15494 1726853355.95703: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853355.95710: Set connection var ansible_shell_type to sh 15494 1726853355.95719: Set connection var ansible_timeout to 10 15494 1726853355.95731: Set connection var ansible_shell_executable to /bin/sh 15494 1726853355.95758: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.95766: variable 'ansible_connection' from source: unknown 15494 1726853355.95776: variable 'ansible_module_compression' from source: unknown 15494 1726853355.95790: variable 'ansible_shell_type' from source: unknown 15494 1726853355.95797: variable 'ansible_shell_executable' from source: unknown 15494 1726853355.95804: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853355.95812: variable 'ansible_pipelining' from source: unknown 15494 1726853355.95818: variable 'ansible_timeout' from source: unknown 15494 1726853355.95827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853355.96021: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853355.96036: variable 'omit' from source: magic vars 15494 1726853355.96044: starting attempt loop 15494 1726853355.96050: running 
the handler 15494 1726853355.96066: _low_level_execute_command(): starting 15494 1726853355.96079: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853355.96798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853355.96883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853355.96920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853355.96942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853355.96955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853355.97041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853355.98735: stdout chunk (state=3): >>>/root <<< 15494 1726853355.99130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853355.99133: stdout chunk (state=3): >>><<< 15494 1726853355.99136: stderr chunk (state=3): >>><<< 15494 1726853355.99139: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853355.99142: _low_level_execute_command(): starting 15494 1726853355.99144: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409 `" && echo ansible-tmp-1726853355.9906476-16592-38657082183409="` echo /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409 `" ) && sleep 0' 15494 1726853356.00505: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.00660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.01039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.02897: stdout chunk (state=3): >>>ansible-tmp-1726853355.9906476-16592-38657082183409=/root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409 <<< 15494 1726853356.03079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.03082: stdout chunk (state=3): >>><<< 15494 1726853356.03084: stderr chunk (state=3): >>><<< 15494 1726853356.03087: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853355.9906476-16592-38657082183409=/root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853356.03111: variable 'ansible_module_compression' from source: unknown 15494 1726853356.03152: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15494 1726853356.03304: variable 'ansible_facts' from source: unknown 15494 1726853356.03507: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py 15494 1726853356.03875: Sending initial data 15494 1726853356.03878: Sent initial data (152 bytes) 15494 1726853356.04773: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.04984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.05011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853356.05093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.05156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.06801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853356.06875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853356.06907: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py" <<< 15494 1726853356.06911: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpmyczozff /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py <<< 15494 1726853356.07016: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpmyczozff" to remote "/root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py" <<< 15494 1726853356.08001: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.08004: stderr chunk (state=3): >>><<< 15494 1726853356.08009: stdout chunk (state=3): >>><<< 15494 1726853356.08084: done transferring module to remote 15494 1726853356.08096: _low_level_execute_command(): starting 15494 1726853356.08102: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/ /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py && sleep 0' 15494 1726853356.09449: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.09453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.09456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.09458: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.09461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853356.09463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.09622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853356.09626: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.09628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.09693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.11464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.11467: stderr chunk (state=3): >>><<< 15494 1726853356.11470: stdout chunk (state=3): >>><<< 15494 1726853356.11491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853356.11494: _low_level_execute_command(): starting 15494 1726853356.11499: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/AnsiballZ_ping.py && sleep 0' 15494 1726853356.12739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.12748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.12892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.12896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.12898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15494 1726853356.12901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853356.12903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.13160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.13211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.28102: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15494 1726853356.29493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853356.29497: stdout chunk (state=3): >>><<< 15494 1726853356.29500: stderr chunk (state=3): >>><<< 15494 1726853356.29592: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853356.29615: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853356.29624: _low_level_execute_command(): starting 15494 1726853356.29629: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853355.9906476-16592-38657082183409/ > /dev/null 2>&1 && sleep 0' 15494 1726853356.30677: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.30700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.30703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.30744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.30760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853356.30768: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853356.30809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.30812: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853356.30850: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.30897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.30906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.30979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.32867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.32874: stdout chunk (state=3): >>><<< 15494 1726853356.32877: stderr chunk (state=3): >>><<< 15494 1726853356.33284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853356.33292: handler run complete 15494 1726853356.33295: attempt loop complete, returning result 15494 1726853356.33298: _execute() done 15494 1726853356.33300: dumping result to json 15494 1726853356.33302: done dumping result, returning 15494 1726853356.33305: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-0028-1a50-00000000004f] 15494 1726853356.33307: sending task result for task 02083763-bbaf-0028-1a50-00000000004f 15494 1726853356.33373: done sending task result for task 02083763-bbaf-0028-1a50-00000000004f 15494 1726853356.33376: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15494 1726853356.33447: no more pending results, returning what we have 15494 1726853356.33451: results queue empty 15494 1726853356.33452: checking for any_errors_fatal 15494 1726853356.33459: done checking for any_errors_fatal 15494 1726853356.33460: checking for max_fail_percentage 15494 1726853356.33461: done checking for max_fail_percentage 15494 1726853356.33462: checking to see if all hosts have failed and the running result is not ok 15494 1726853356.33463: done checking to see if all hosts have failed 15494 1726853356.33464: getting the remaining hosts for this loop 15494 1726853356.33466: done getting the remaining hosts for this loop 15494 1726853356.33469: getting the next task for host managed_node1 15494 1726853356.33480: done getting next task for host managed_node1 15494 1726853356.33483: ^ task is: TASK: meta (role_complete) 15494 
1726853356.33485: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853356.33497: getting variables 15494 1726853356.33500: in VariableManager get_vars() 15494 1726853356.33536: Calling all_inventory to load vars for managed_node1 15494 1726853356.33538: Calling groups_inventory to load vars for managed_node1 15494 1726853356.33540: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853356.33550: Calling all_plugins_play to load vars for managed_node1 15494 1726853356.33553: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853356.33556: Calling groups_plugins_play to load vars for managed_node1 15494 1726853356.35044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853356.37108: done with get_vars() 15494 1726853356.37129: done getting variables 15494 1726853356.37215: done queuing things up, now waiting for results queue to drain 15494 1726853356.37217: results queue empty 15494 1726853356.37217: checking for any_errors_fatal 15494 1726853356.37220: done checking for any_errors_fatal 15494 1726853356.37220: checking for max_fail_percentage 15494 1726853356.37222: done checking for max_fail_percentage 15494 1726853356.37222: checking to see if all hosts have failed and the running result is not ok 15494 1726853356.37223: done checking to see if all hosts have failed 15494 1726853356.37224: getting the remaining hosts for this loop 15494 1726853356.37224: done getting the remaining hosts for this loop 15494 1726853356.37227: getting the next task for host managed_node1 15494 1726853356.37230: done getting next task for host managed_node1 15494 1726853356.37232: ^ task is: TASK: meta (flush_handlers) 
15494 1726853356.37233: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853356.37236: getting variables 15494 1726853356.37237: in VariableManager get_vars() 15494 1726853356.37251: Calling all_inventory to load vars for managed_node1 15494 1726853356.37253: Calling groups_inventory to load vars for managed_node1 15494 1726853356.37255: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853356.37264: Calling all_plugins_play to load vars for managed_node1 15494 1726853356.37267: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853356.37273: Calling groups_plugins_play to load vars for managed_node1 15494 1726853356.39208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853356.41183: done with get_vars() 15494 1726853356.41209: done getting variables 15494 1726853356.41258: in VariableManager get_vars() 15494 1726853356.41288: Calling all_inventory to load vars for managed_node1 15494 1726853356.41291: Calling groups_inventory to load vars for managed_node1 15494 1726853356.41293: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853356.41315: Calling all_plugins_play to load vars for managed_node1 15494 1726853356.41328: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853356.41332: Calling groups_plugins_play to load vars for managed_node1 15494 1726853356.42680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853356.44364: done with get_vars() 15494 1726853356.44401: done queuing things up, now waiting for results queue to drain 15494 1726853356.44403: results queue empty 15494 
1726853356.44404: checking for any_errors_fatal 15494 1726853356.44405: done checking for any_errors_fatal 15494 1726853356.44406: checking for max_fail_percentage 15494 1726853356.44407: done checking for max_fail_percentage 15494 1726853356.44408: checking to see if all hosts have failed and the running result is not ok 15494 1726853356.44409: done checking to see if all hosts have failed 15494 1726853356.44409: getting the remaining hosts for this loop 15494 1726853356.44410: done getting the remaining hosts for this loop 15494 1726853356.44413: getting the next task for host managed_node1 15494 1726853356.44417: done getting next task for host managed_node1 15494 1726853356.44418: ^ task is: TASK: meta (flush_handlers) 15494 1726853356.44420: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853356.44428: getting variables 15494 1726853356.44429: in VariableManager get_vars() 15494 1726853356.44440: Calling all_inventory to load vars for managed_node1 15494 1726853356.44500: Calling groups_inventory to load vars for managed_node1 15494 1726853356.44503: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853356.44508: Calling all_plugins_play to load vars for managed_node1 15494 1726853356.44511: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853356.44615: Calling groups_plugins_play to load vars for managed_node1 15494 1726853356.46893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853356.48802: done with get_vars() 15494 1726853356.48844: done getting variables 15494 1726853356.48904: in VariableManager get_vars() 15494 1726853356.48918: Calling all_inventory to load vars for managed_node1 15494 1726853356.48921: Calling groups_inventory to load vars for managed_node1 15494 1726853356.48923: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853356.48927: Calling all_plugins_play to load vars for managed_node1 15494 1726853356.48930: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853356.48932: Calling groups_plugins_play to load vars for managed_node1 15494 1726853356.50578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853356.53200: done with get_vars() 15494 1726853356.53234: done queuing things up, now waiting for results queue to drain 15494 1726853356.53242: results queue empty 15494 1726853356.53243: checking for any_errors_fatal 15494 1726853356.53244: done checking for any_errors_fatal 15494 1726853356.53245: checking for max_fail_percentage 15494 1726853356.53246: done checking for max_fail_percentage 15494 1726853356.53249: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853356.53250: done checking to see if all hosts have failed 15494 1726853356.53251: getting the remaining hosts for this loop 15494 1726853356.53252: done getting the remaining hosts for this loop 15494 1726853356.53255: getting the next task for host managed_node1 15494 1726853356.53259: done getting next task for host managed_node1 15494 1726853356.53260: ^ task is: None 15494 1726853356.53261: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853356.53262: done queuing things up, now waiting for results queue to drain 15494 1726853356.53263: results queue empty 15494 1726853356.53264: checking for any_errors_fatal 15494 1726853356.53264: done checking for any_errors_fatal 15494 1726853356.53265: checking for max_fail_percentage 15494 1726853356.53266: done checking for max_fail_percentage 15494 1726853356.53266: checking to see if all hosts have failed and the running result is not ok 15494 1726853356.53267: done checking to see if all hosts have failed 15494 1726853356.53268: getting the next task for host managed_node1 15494 1726853356.53274: done getting next task for host managed_node1 15494 1726853356.53275: ^ task is: None 15494 1726853356.53276: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853356.53323: in VariableManager get_vars() 15494 1726853356.53339: done with get_vars() 15494 1726853356.53344: in VariableManager get_vars() 15494 1726853356.53381: done with get_vars() 15494 1726853356.53385: variable 'omit' from source: magic vars 15494 1726853356.53437: in VariableManager get_vars() 15494 1726853356.53520: done with get_vars() 15494 1726853356.53546: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15494 1726853356.53956: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853356.54062: getting the remaining hosts for this loop 15494 1726853356.54064: done getting the remaining hosts for this loop 15494 1726853356.54066: getting the next task for host managed_node1 15494 1726853356.54068: done getting next task for host managed_node1 15494 1726853356.54072: ^ task is: TASK: Gathering Facts 15494 1726853356.54073: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853356.54075: getting variables 15494 1726853356.54076: in VariableManager get_vars() 15494 1726853356.54084: Calling all_inventory to load vars for managed_node1 15494 1726853356.54089: Calling groups_inventory to load vars for managed_node1 15494 1726853356.54094: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853356.54099: Calling all_plugins_play to load vars for managed_node1 15494 1726853356.54101: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853356.54104: Calling groups_plugins_play to load vars for managed_node1 15494 1726853356.55991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853356.64321: done with get_vars() 15494 1726853356.64344: done getting variables 15494 1726853356.64434: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 13:29:16 -0400 (0:00:00.704) 0:00:25.260 ****** 15494 1726853356.64460: entering _queue_task() for managed_node1/gather_facts 15494 1726853356.65060: worker is 1 (out of 1 available) 15494 1726853356.65070: exiting _queue_task() for managed_node1/gather_facts 15494 1726853356.65116: done queuing things up, now waiting for results queue to drain 15494 1726853356.65118: waiting for pending results... 
15494 1726853356.65414: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853356.65743: in run() - task 02083763-bbaf-0028-1a50-000000000382 15494 1726853356.65750: variable 'ansible_search_path' from source: unknown 15494 1726853356.65753: calling self._execute() 15494 1726853356.65919: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853356.65936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853356.65958: variable 'omit' from source: magic vars 15494 1726853356.66638: variable 'ansible_distribution_major_version' from source: facts 15494 1726853356.66660: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853356.66713: variable 'omit' from source: magic vars 15494 1726853356.66717: variable 'omit' from source: magic vars 15494 1726853356.66764: variable 'omit' from source: magic vars 15494 1726853356.66809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853356.66866: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853356.66893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853356.66930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853356.66934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853356.66977: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853356.67065: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853356.67069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853356.67124: Set connection var ansible_connection to ssh 15494 1726853356.67139: Set 
connection var ansible_pipelining to False 15494 1726853356.67153: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853356.67161: Set connection var ansible_shell_type to sh 15494 1726853356.67170: Set connection var ansible_timeout to 10 15494 1726853356.67192: Set connection var ansible_shell_executable to /bin/sh 15494 1726853356.67276: variable 'ansible_shell_executable' from source: unknown 15494 1726853356.67280: variable 'ansible_connection' from source: unknown 15494 1726853356.67283: variable 'ansible_module_compression' from source: unknown 15494 1726853356.67285: variable 'ansible_shell_type' from source: unknown 15494 1726853356.67287: variable 'ansible_shell_executable' from source: unknown 15494 1726853356.67299: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853356.67303: variable 'ansible_pipelining' from source: unknown 15494 1726853356.67305: variable 'ansible_timeout' from source: unknown 15494 1726853356.67307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853356.67477: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853356.67516: variable 'omit' from source: magic vars 15494 1726853356.67520: starting attempt loop 15494 1726853356.67525: running the handler 15494 1726853356.67540: variable 'ansible_facts' from source: unknown 15494 1726853356.67625: _low_level_execute_command(): starting 15494 1726853356.67628: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853356.68407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.68508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.68527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.68603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.70349: stdout chunk (state=3): >>>/root <<< 15494 1726853356.70550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.70553: stdout chunk (state=3): >>><<< 15494 1726853356.70557: stderr chunk (state=3): >>><<< 15494 1726853356.70684: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853356.70690: _low_level_execute_command(): starting 15494 1726853356.70694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400 `" && echo ansible-tmp-1726853356.705884-16610-266768130133400="` echo /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400 `" ) && sleep 0' 15494 1726853356.71755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.71759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.71762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.71765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.71968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.72097: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.72405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.74355: stdout chunk (state=3): >>>ansible-tmp-1726853356.705884-16610-266768130133400=/root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400 <<< 15494 1726853356.74577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.74581: stdout chunk (state=3): >>><<< 15494 1726853356.74584: stderr chunk (state=3): >>><<< 15494 1726853356.74587: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853356.705884-16610-266768130133400=/root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853356.74590: variable 'ansible_module_compression' from source: unknown 15494 1726853356.74604: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853356.74666: variable 'ansible_facts' from source: unknown 15494 1726853356.74869: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py 15494 1726853356.75087: Sending initial data 15494 1726853356.75091: Sent initial data (153 bytes) 15494 1726853356.75586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.75591: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.75677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.75692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.75707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853356.75718: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.75726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.75807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.77609: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853356.77648: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853356.77697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpkywmbvl8 /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py <<< 15494 1726853356.77701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py" <<< 15494 1726853356.77760: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpkywmbvl8" to remote "/root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py" <<< 15494 1726853356.79425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.79583: stdout chunk (state=3): >>><<< 15494 1726853356.79587: stderr chunk (state=3): >>><<< 15494 1726853356.79589: done transferring module to remote 15494 1726853356.79592: _low_level_execute_command(): starting 15494 1726853356.79594: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/ /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py && sleep 0' 15494 1726853356.80441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.80445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.80448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.80450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.80452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 
1726853356.80455: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853356.80457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.80459: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853356.80462: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853356.80464: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853356.80466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.80469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.80473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.80475: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853356.80477: stderr chunk (state=3): >>>debug2: match found <<< 15494 1726853356.80479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.80486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.80537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.80578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853356.82452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853356.82513: stderr chunk (state=3): >>><<< 15494 1726853356.82528: stdout chunk (state=3): >>><<< 15494 1726853356.82543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853356.82617: _low_level_execute_command(): starting 15494 1726853356.82620: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/AnsiballZ_setup.py && sleep 0' 15494 1726853356.83180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853356.83195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853356.83223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853356.83240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853356.83258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853356.83272: stderr chunk (state=3): >>>debug2: match not found <<< 15494 
1726853356.83287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.83331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853356.83401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853356.83441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853356.83457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853356.83547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853357.48784: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_loadavg": {"1m": 0.611328125, "5m": 0.36767578125, "15m": 0.1611328125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "17", "epoch": "1726853357", "epoch_int": "1726853357", "date": "2024-09-20", "time": "13:29:17", "iso8601_micro": "2024-09-20T17:29:17.109324Z", "iso8601": "2024-09-20T17:29:17Z", "iso8601_basic": "20240920T132917109324", "iso8601_basic_short": "20240920T132917", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 523, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797109760, "block_size": 4096, "block_total": 65519099, "block_available": 63915310, "block_used": 1603789, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": 
"on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853357.50754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853357.50833: stdout chunk (state=3): >>><<< 15494 1726853357.51077: stderr chunk (state=3): >>><<< 15494 1726853357.51082: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_loadavg": {"1m": 0.611328125, "5m": 0.36767578125, "15m": 0.1611328125}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "17", "epoch": "1726853357", "epoch_int": "1726853357", "date": "2024-09-20", "time": "13:29:17", "iso8601_micro": "2024-09-20T17:29:17.109324Z", "iso8601": "2024-09-20T17:29:17Z", "iso8601_basic": "20240920T132917109324", "iso8601_basic_short": "20240920T132917", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2944, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 587, "free": 2944}, "nocache": {"free": 3281, "used": 250}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 523, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797109760, "block_size": 4096, "block_total": 65519099, "block_available": 63915310, "block_used": 1603789, "inode_total": 131070960, 
"inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", 
"fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": 
"on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_apparmor": {"status": 
"disabled"}, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853357.52793: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853357.52797: _low_level_execute_command(): starting 15494 1726853357.52800: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853356.705884-16610-266768130133400/ > /dev/null 2>&1 && sleep 0' 15494 1726853357.53867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853357.54186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853357.54387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853357.54455: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853357.56339: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853357.56492: stderr chunk (state=3): >>><<< 15494 1726853357.56495: stdout chunk (state=3): >>><<< 15494 1726853357.56515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853357.56526: handler run complete 15494 1726853357.56855: variable 'ansible_facts' from source: unknown 15494 1726853357.56973: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.57677: variable 'ansible_facts' from source: unknown 15494 1726853357.57841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.58083: attempt loop complete, returning result 15494 1726853357.58086: _execute() done 15494 1726853357.58090: dumping result to json 15494 1726853357.58156: done dumping result, returning 15494 1726853357.58159: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-000000000382] 15494 1726853357.58166: sending task result for task 02083763-bbaf-0028-1a50-000000000382 15494 1726853357.59182: done sending task result for task 02083763-bbaf-0028-1a50-000000000382 15494 1726853357.59185: WORKER PROCESS EXITING ok: [managed_node1] 15494 1726853357.59949: no more pending results, returning what we have 15494 1726853357.59952: results queue empty 15494 1726853357.59953: checking for any_errors_fatal 15494 1726853357.59954: done checking for any_errors_fatal 15494 1726853357.59955: checking for max_fail_percentage 15494 1726853357.59957: done checking for max_fail_percentage 15494 1726853357.59957: checking to see if all hosts have failed and the running result is not ok 15494 1726853357.59960: done checking to see if all hosts have failed 15494 1726853357.59961: getting the remaining hosts for this loop 15494 1726853357.59962: done getting the remaining hosts for this loop 15494 1726853357.59965: getting the next task for host managed_node1 15494 1726853357.59970: done getting next task for host managed_node1 15494 1726853357.60175: ^ task is: TASK: meta (flush_handlers) 15494 1726853357.60178: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15494 1726853357.60183: getting variables 15494 1726853357.60184: in VariableManager get_vars() 15494 1726853357.60206: Calling all_inventory to load vars for managed_node1 15494 1726853357.60209: Calling groups_inventory to load vars for managed_node1 15494 1726853357.60212: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853357.60222: Calling all_plugins_play to load vars for managed_node1 15494 1726853357.60226: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853357.60229: Calling groups_plugins_play to load vars for managed_node1 15494 1726853357.62079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.63710: done with get_vars() 15494 1726853357.63732: done getting variables 15494 1726853357.63803: in VariableManager get_vars() 15494 1726853357.63813: Calling all_inventory to load vars for managed_node1 15494 1726853357.63816: Calling groups_inventory to load vars for managed_node1 15494 1726853357.63818: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853357.63823: Calling all_plugins_play to load vars for managed_node1 15494 1726853357.63826: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853357.63828: Calling groups_plugins_play to load vars for managed_node1 15494 1726853357.65173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.67650: done with get_vars() 15494 1726853357.67688: done queuing things up, now waiting for results queue to drain 15494 1726853357.67690: results queue empty 15494 1726853357.67691: checking for any_errors_fatal 15494 1726853357.67695: done checking for any_errors_fatal 15494 1726853357.67696: checking for max_fail_percentage 15494 1726853357.67697: done checking for max_fail_percentage 15494 
1726853357.67698: checking to see if all hosts have failed and the running result is not ok 15494 1726853357.67699: done checking to see if all hosts have failed 15494 1726853357.67704: getting the remaining hosts for this loop 15494 1726853357.67705: done getting the remaining hosts for this loop 15494 1726853357.67708: getting the next task for host managed_node1 15494 1726853357.67713: done getting next task for host managed_node1 15494 1726853357.67715: ^ task is: TASK: Include the task 'delete_interface.yml' 15494 1726853357.67717: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853357.67719: getting variables 15494 1726853357.67720: in VariableManager get_vars() 15494 1726853357.67731: Calling all_inventory to load vars for managed_node1 15494 1726853357.67734: Calling groups_inventory to load vars for managed_node1 15494 1726853357.67736: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853357.67742: Calling all_plugins_play to load vars for managed_node1 15494 1726853357.67745: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853357.67748: Calling groups_plugins_play to load vars for managed_node1 15494 1726853357.68900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.70483: done with get_vars() 15494 1726853357.70650: done getting variables

TASK [Include the task 'delete_interface.yml'] *********************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8
Friday 20 September 2024  13:29:17 -0400 (0:00:01.064)       0:00:26.325 ******

15494 1726853357.70927: entering _queue_task() for
managed_node1/include_tasks 15494 1726853357.71589: worker is 1 (out of 1 available) 15494 1726853357.71599: exiting _queue_task() for managed_node1/include_tasks 15494 1726853357.71610: done queuing things up, now waiting for results queue to drain 15494 1726853357.71611: waiting for pending results... 15494 1726853357.71717: running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' 15494 1726853357.71956: in run() - task 02083763-bbaf-0028-1a50-000000000052 15494 1726853357.71965: variable 'ansible_search_path' from source: unknown 15494 1726853357.71976: calling self._execute() 15494 1726853357.72065: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853357.72084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853357.72144: variable 'omit' from source: magic vars 15494 1726853357.72433: variable 'ansible_distribution_major_version' from source: facts 15494 1726853357.72441: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853357.72447: _execute() done 15494 1726853357.72453: dumping result to json 15494 1726853357.72483: done dumping result, returning 15494 1726853357.72488: done running TaskExecutor() for managed_node1/TASK: Include the task 'delete_interface.yml' [02083763-bbaf-0028-1a50-000000000052] 15494 1726853357.72491: sending task result for task 02083763-bbaf-0028-1a50-000000000052 15494 1726853357.72577: done sending task result for task 02083763-bbaf-0028-1a50-000000000052 15494 1726853357.72579: WORKER PROCESS EXITING 15494 1726853357.72603: no more pending results, returning what we have 15494 1726853357.72608: in VariableManager get_vars() 15494 1726853357.72643: Calling all_inventory to load vars for managed_node1 15494 1726853357.72645: Calling groups_inventory to load vars for managed_node1 15494 1726853357.72649: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853357.72662: Calling 
all_plugins_play to load vars for managed_node1 15494 1726853357.72664: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853357.72667: Calling groups_plugins_play to load vars for managed_node1 15494 1726853357.75040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.77830: done with get_vars() 15494 1726853357.78018: variable 'ansible_search_path' from source: unknown 15494 1726853357.78073: we have included files to process 15494 1726853357.78075: generating all_blocks data 15494 1726853357.78076: done generating all_blocks data 15494 1726853357.78077: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15494 1726853357.78078: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15494 1726853357.78141: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15494 1726853357.78516: done processing included file 15494 1726853357.78521: iterating over new_blocks loaded from include file 15494 1726853357.78523: in VariableManager get_vars() 15494 1726853357.78537: done with get_vars() 15494 1726853357.78539: filtering new block on tags 15494 1726853357.78560: done filtering new block on tags 15494 1726853357.78563: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node1 15494 1726853357.78568: extending task lists for all hosts with included blocks 15494 1726853357.78610: done extending task lists 15494 1726853357.78611: done processing included files 15494 1726853357.78612: results queue empty 15494 1726853357.78612: checking for any_errors_fatal 15494 1726853357.78614: 
done checking for any_errors_fatal 15494 1726853357.78615: checking for max_fail_percentage 15494 1726853357.78616: done checking for max_fail_percentage 15494 1726853357.78617: checking to see if all hosts have failed and the running result is not ok 15494 1726853357.78617: done checking to see if all hosts have failed 15494 1726853357.78618: getting the remaining hosts for this loop 15494 1726853357.78619: done getting the remaining hosts for this loop 15494 1726853357.78622: getting the next task for host managed_node1 15494 1726853357.78626: done getting next task for host managed_node1 15494 1726853357.78629: ^ task is: TASK: Remove test interface if necessary 15494 1726853357.78631: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853357.78633: getting variables 15494 1726853357.78634: in VariableManager get_vars() 15494 1726853357.78642: Calling all_inventory to load vars for managed_node1 15494 1726853357.78644: Calling groups_inventory to load vars for managed_node1 15494 1726853357.78649: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853357.78655: Calling all_plugins_play to load vars for managed_node1 15494 1726853357.78657: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853357.78660: Calling groups_plugins_play to load vars for managed_node1 15494 1726853357.79997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853357.82013: done with get_vars() 15494 1726853357.82039: done getting variables 15494 1726853357.82097: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Remove test interface if necessary] **************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3
Friday 20 September 2024  13:29:17 -0400 (0:00:00.111)       0:00:26.437 ******

15494 1726853357.82126: entering _queue_task() for managed_node1/command 15494 1726853357.82540: worker is 1 (out of 1 available) 15494 1726853357.82559: exiting _queue_task() for managed_node1/command 15494 1726853357.82570: done queuing things up, now waiting for results queue to drain 15494 1726853357.82575: waiting for pending results...
15494 1726853357.82867: running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary 15494 1726853357.82922: in run() - task 02083763-bbaf-0028-1a50-000000000393 15494 1726853357.82934: variable 'ansible_search_path' from source: unknown 15494 1726853357.82938: variable 'ansible_search_path' from source: unknown 15494 1726853357.82966: calling self._execute() 15494 1726853357.83083: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853357.83087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853357.83090: variable 'omit' from source: magic vars 15494 1726853357.83482: variable 'ansible_distribution_major_version' from source: facts 15494 1726853357.83485: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853357.83488: variable 'omit' from source: magic vars 15494 1726853357.83491: variable 'omit' from source: magic vars 15494 1726853357.83575: variable 'interface' from source: set_fact 15494 1726853357.83579: variable 'omit' from source: magic vars 15494 1726853357.83595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853357.83697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853357.83701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853357.83703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853357.83708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853357.83711: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853357.83714: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853357.83716: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853357.83835: Set connection var ansible_connection to ssh 15494 1726853357.83839: Set connection var ansible_pipelining to False 15494 1726853357.83841: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853357.83843: Set connection var ansible_shell_type to sh 15494 1726853357.83846: Set connection var ansible_timeout to 10 15494 1726853357.83851: Set connection var ansible_shell_executable to /bin/sh 15494 1726853357.83945: variable 'ansible_shell_executable' from source: unknown 15494 1726853357.83952: variable 'ansible_connection' from source: unknown 15494 1726853357.83955: variable 'ansible_module_compression' from source: unknown 15494 1726853357.83957: variable 'ansible_shell_type' from source: unknown 15494 1726853357.83959: variable 'ansible_shell_executable' from source: unknown 15494 1726853357.83962: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853357.83964: variable 'ansible_pipelining' from source: unknown 15494 1726853357.83966: variable 'ansible_timeout' from source: unknown 15494 1726853357.83968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853357.84014: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853357.84056: variable 'omit' from source: magic vars 15494 1726853357.84059: starting attempt loop 15494 1726853357.84062: running the handler 15494 1726853357.84064: _low_level_execute_command(): starting 15494 1726853357.84066: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853357.84654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 
1726853357.84663: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853357.84713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853357.84716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853357.84718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853357.84722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853357.84725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853357.84858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853357.84910: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853357.86625: stdout chunk (state=3): >>>/root <<< 15494 1726853357.86797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853357.86801: stdout chunk (state=3): >>><<< 15494 1726853357.86803: stderr chunk (state=3): >>><<< 15494 1726853357.86878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853357.86882: _low_level_execute_command(): starting 15494 1726853357.86885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623 `" && echo ansible-tmp-1726853357.8682728-16665-54013489523623="` echo /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623 `" ) && sleep 0' 15494 1726853357.88074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853357.88159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853357.88248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853357.88251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853357.88274: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853357.88277: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853357.88280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853357.88282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853357.88284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853357.88286: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853357.88288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853357.88290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853357.88333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853357.88376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853357.88405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853357.88490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853357.90368: stdout chunk (state=3): >>>ansible-tmp-1726853357.8682728-16665-54013489523623=/root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623 <<< 15494 1726853357.90478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853357.90518: stderr chunk (state=3): >>><<< 15494 1726853357.90521: stdout chunk (state=3): >>><<< 15494 1726853357.90539: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853357.8682728-16665-54013489523623=/root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853357.90597: variable 'ansible_module_compression' from source: unknown 15494 1726853357.90655: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15494 1726853357.90687: variable 'ansible_facts' from source: unknown 15494 1726853357.90789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py 15494 1726853357.91004: Sending initial data 15494 1726853357.91007: Sent initial data (155 bytes) 15494 1726853357.91786: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853357.91825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853357.91893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853357.93450: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 
1726853357.93488: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853357.93540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp1r0dda9n /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py <<< 15494 1726853357.93547: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py" <<< 15494 1726853357.93590: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp1r0dda9n" to remote "/root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py" <<< 15494 1726853357.94221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853357.94280: stderr chunk (state=3): >>><<< 15494 1726853357.94283: stdout chunk (state=3): >>><<< 15494 1726853357.94334: done transferring module to remote 15494 1726853357.94349: _low_level_execute_command(): starting 15494 1726853357.94362: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/ /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py && sleep 0' 15494 1726853357.94919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853357.94957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853357.94960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853357.94966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853357.94968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853357.94970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853357.95059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853357.95106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853357.97077: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853357.97081: stdout chunk (state=3): >>><<< 15494 1726853357.97083: stderr chunk (state=3): >>><<< 15494 1726853357.97097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853357.97100: _low_level_execute_command(): starting 15494 1726853357.97102: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/AnsiballZ_command.py && sleep 0' 15494 1726853357.98092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853357.98214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 
1726853357.98305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853358.14274: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 13:29:18.134065", "end": "2024-09-20 13:29:18.141849", "delta": "0:00:00.007784", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15494 1726853358.15998: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. <<< 15494 1726853358.16002: stdout chunk (state=3): >>><<< 15494 1726853358.16004: stderr chunk (state=3): >>><<< 15494 1726853358.16008: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 13:29:18.134065", "end": "2024-09-20 13:29:18.141849", "delta": "0:00:00.007784", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. 15494 1726853358.16179: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853358.16183: _low_level_execute_command(): starting 15494 1726853358.16185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853357.8682728-16665-54013489523623/ > /dev/null 2>&1 && sleep 0' 15494 1726853358.17312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853358.17343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853358.17367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853358.17454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853358.17485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853358.17489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853358.17495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853358.17562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853358.19454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853358.19486: stderr chunk (state=3): >>><<< 15494 1726853358.19525: stdout chunk (state=3): >>><<< 15494 1726853358.19705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853358.19709: handler run complete 15494 1726853358.19711: Evaluated conditional (False): False 15494 1726853358.19713: attempt loop complete, returning result 15494 1726853358.19716: _execute() done 15494 1726853358.19718: dumping result to json 15494 1726853358.19720: done dumping result, returning 15494 1726853358.19722: done running TaskExecutor() for managed_node1/TASK: Remove test interface if necessary [02083763-bbaf-0028-1a50-000000000393] 15494 1726853358.19770: sending task result for task 02083763-bbaf-0028-1a50-000000000393 15494 1726853358.20188: done sending task result for task 02083763-bbaf-0028-1a50-000000000393
fatal: [managed_node1]: FAILED! => {
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "del",
        "LSR-TST-br31"
    ],
    "delta": "0:00:00.007784",
    "end": "2024-09-20 13:29:18.141849",
    "rc": 1,
    "start": "2024-09-20 13:29:18.134065"
}

STDERR:

Cannot find device "LSR-TST-br31"

MSG:

non-zero return code
...ignoring
15494 1726853358.20437: no more pending results, returning what we have 15494 1726853358.20442: results queue empty 15494 1726853358.20444: checking for any_errors_fatal 15494 1726853358.20446: done checking for any_errors_fatal 15494 1726853358.20446: checking for max_fail_percentage 15494 1726853358.20448: done checking for max_fail_percentage 15494 1726853358.20449: checking to see if all hosts have failed and the running result is not ok 15494 1726853358.20450: done checking to see if all hosts have failed 15494 1726853358.20451: getting the remaining hosts for this loop 15494 1726853358.20452: done getting the remaining hosts for this loop 15494 1726853358.20456: getting the next task for host managed_node1 15494 1726853358.20469: done getting next task for host managed_node1 15494 1726853358.20474: ^ task is: TASK: meta (flush_handlers) 15494 1726853358.20477: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
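The failed-but-ignored result above can be reconstructed from the logged module args (`_raw_params: 'ip link del LSR-TST-br31'`, `_uses_shell: false`) and the "...ignoring" outcome. A hedged sketch of what that task likely looks like; the real task lives in the fedora.linux_system_roles test playbooks and is not visible in this log:

```yaml
# Hypothetical reconstruction from the logged module_args; the actual
# task file may differ (e.g. it may use failed_when instead).
- name: Remove test interface if necessary
  ansible.builtin.command: ip link del LSR-TST-br31
  ignore_errors: true   # matches the "...ignoring" after the fatal result
```

`rc=1` with `Cannot find device "LSR-TST-br31"` is the expected outcome when the interface was already absent, which is why the play treats the failure as non-fatal cleanup.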
False 15494 1726853358.20483: getting variables 15494 1726853358.20485: in VariableManager get_vars() 15494 1726853358.20522: Calling all_inventory to load vars for managed_node1 15494 1726853358.20525: Calling groups_inventory to load vars for managed_node1 15494 1726853358.20529: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853358.20545: Calling all_plugins_play to load vars for managed_node1 15494 1726853358.20549: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853358.20555: Calling groups_plugins_play to load vars for managed_node1 15494 1726853358.21247: WORKER PROCESS EXITING 15494 1726853358.23983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853358.25969: done with get_vars() 15494 1726853358.26005: done getting variables 15494 1726853358.26077: in VariableManager get_vars() 15494 1726853358.26087: Calling all_inventory to load vars for managed_node1 15494 1726853358.26098: Calling groups_inventory to load vars for managed_node1 15494 1726853358.26101: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853358.26106: Calling all_plugins_play to load vars for managed_node1 15494 1726853358.26108: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853358.26111: Calling groups_plugins_play to load vars for managed_node1 15494 1726853358.28034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853358.30354: done with get_vars() 15494 1726853358.30388: done queuing things up, now waiting for results queue to drain 15494 1726853358.30390: results queue empty 15494 1726853358.30391: checking for any_errors_fatal 15494 1726853358.30394: done checking for any_errors_fatal 15494 1726853358.30395: checking for max_fail_percentage 15494 1726853358.30396: done checking for max_fail_percentage 15494 1726853358.30397: checking to see if all 
hosts have failed and the running result is not ok 15494 1726853358.30397: done checking to see if all hosts have failed 15494 1726853358.30398: getting the remaining hosts for this loop 15494 1726853358.30399: done getting the remaining hosts for this loop 15494 1726853358.30402: getting the next task for host managed_node1 15494 1726853358.30406: done getting next task for host managed_node1 15494 1726853358.30407: ^ task is: TASK: meta (flush_handlers) 15494 1726853358.30409: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853358.30412: getting variables 15494 1726853358.30412: in VariableManager get_vars() 15494 1726853358.30421: Calling all_inventory to load vars for managed_node1 15494 1726853358.30423: Calling groups_inventory to load vars for managed_node1 15494 1726853358.30425: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853358.30430: Calling all_plugins_play to load vars for managed_node1 15494 1726853358.30432: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853358.30438: Calling groups_plugins_play to load vars for managed_node1 15494 1726853358.31320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853358.32309: done with get_vars() 15494 1726853358.32326: done getting variables 15494 1726853358.32362: in VariableManager get_vars() 15494 1726853358.32369: Calling all_inventory to load vars for managed_node1 15494 1726853358.32372: Calling groups_inventory to load vars for managed_node1 15494 1726853358.32374: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853358.32378: Calling all_plugins_play to load vars for managed_node1 15494 1726853358.32379: Calling 
groups_plugins_inventory to load vars for managed_node1 15494 1726853358.32381: Calling groups_plugins_play to load vars for managed_node1 15494 1726853358.33068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853358.34341: done with get_vars() 15494 1726853358.34367: done queuing things up, now waiting for results queue to drain 15494 1726853358.34369: results queue empty 15494 1726853358.34376: checking for any_errors_fatal 15494 1726853358.34379: done checking for any_errors_fatal 15494 1726853358.34380: checking for max_fail_percentage 15494 1726853358.34381: done checking for max_fail_percentage 15494 1726853358.34381: checking to see if all hosts have failed and the running result is not ok 15494 1726853358.34382: done checking to see if all hosts have failed 15494 1726853358.34383: getting the remaining hosts for this loop 15494 1726853358.34384: done getting the remaining hosts for this loop 15494 1726853358.34387: getting the next task for host managed_node1 15494 1726853358.34390: done getting next task for host managed_node1 15494 1726853358.34391: ^ task is: None 15494 1726853358.34392: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853358.34392: done queuing things up, now waiting for results queue to drain 15494 1726853358.34393: results queue empty 15494 1726853358.34394: checking for any_errors_fatal 15494 1726853358.34394: done checking for any_errors_fatal 15494 1726853358.34394: checking for max_fail_percentage 15494 1726853358.34395: done checking for max_fail_percentage 15494 1726853358.34395: checking to see if all hosts have failed and the running result is not ok 15494 1726853358.34396: done checking to see if all hosts have failed 15494 1726853358.34397: getting the next task for host managed_node1 15494 1726853358.34400: done getting next task for host managed_node1 15494 1726853358.34400: ^ task is: None 15494 1726853358.34401: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853358.34439: in VariableManager get_vars() 15494 1726853358.34462: done with get_vars() 15494 1726853358.34474: in VariableManager get_vars() 15494 1726853358.34483: done with get_vars() 15494 1726853358.34486: variable 'omit' from source: magic vars 15494 1726853358.34591: variable 'profile' from source: play vars 15494 1726853358.34699: in VariableManager get_vars() 15494 1726853358.34715: done with get_vars() 15494 1726853358.34736: variable 'omit' from source: magic vars 15494 1726853358.34795: variable 'profile' from source: play vars

PLAY [Remove {{ profile }}] ****************************************************

15494 1726853358.35555: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853358.35578: getting the remaining hosts for this loop 15494 1726853358.35579: done getting the remaining hosts for this loop 15494 1726853358.35584: getting the next task for host managed_node1 15494 1726853358.35586: done getting next task for host managed_node1 15494 1726853358.35588: ^ task is: TASK: Gathering Facts 15494 1726853358.35590: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15494 1726853358.35592: getting variables 15494 1726853358.35593: in VariableManager get_vars() 15494 1726853358.35607: Calling all_inventory to load vars for managed_node1 15494 1726853358.35609: Calling groups_inventory to load vars for managed_node1 15494 1726853358.35611: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853358.35616: Calling all_plugins_play to load vars for managed_node1 15494 1726853358.35619: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853358.35622: Calling groups_plugins_play to load vars for managed_node1 15494 1726853358.36644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853358.37679: done with get_vars() 15494 1726853358.37692: done getting variables 15494 1726853358.37739: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Friday 20 September 2024 13:29:18 -0400 (0:00:00.556) 0:00:26.993 ******

15494 1726853358.37782: entering _queue_task() for managed_node1/gather_facts 15494 1726853358.38077: worker is 1 (out of 1 available) 15494 1726853358.38088: exiting _queue_task() for managed_node1/gather_facts 15494 1726853358.38102: done queuing things up, now waiting for results queue to drain 15494 1726853358.38103: waiting for pending results... 
15494 1726853358.38309: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853358.38372: in run() - task 02083763-bbaf-0028-1a50-0000000003a1 15494 1726853358.38384: variable 'ansible_search_path' from source: unknown 15494 1726853358.38515: calling self._execute() 15494 1726853358.38534: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853358.38548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853358.38563: variable 'omit' from source: magic vars 15494 1726853358.39020: variable 'ansible_distribution_major_version' from source: facts 15494 1726853358.39044: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853358.39190: variable 'omit' from source: magic vars 15494 1726853358.39201: variable 'omit' from source: magic vars 15494 1726853358.39204: variable 'omit' from source: magic vars 15494 1726853358.39213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853358.39243: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853358.39253: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853358.39267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853358.39278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853358.39306: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853358.39312: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853358.39320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853358.39381: Set connection var ansible_connection to ssh 15494 1726853358.39387: Set 
connection var ansible_pipelining to False 15494 1726853358.39392: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853358.39395: Set connection var ansible_shell_type to sh 15494 1726853358.39400: Set connection var ansible_timeout to 10 15494 1726853358.39406: Set connection var ansible_shell_executable to /bin/sh 15494 1726853358.39431: variable 'ansible_shell_executable' from source: unknown 15494 1726853358.39434: variable 'ansible_connection' from source: unknown 15494 1726853358.39437: variable 'ansible_module_compression' from source: unknown 15494 1726853358.39439: variable 'ansible_shell_type' from source: unknown 15494 1726853358.39441: variable 'ansible_shell_executable' from source: unknown 15494 1726853358.39443: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853358.39445: variable 'ansible_pipelining' from source: unknown 15494 1726853358.39448: variable 'ansible_timeout' from source: unknown 15494 1726853358.39449: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853358.39777: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853358.39781: variable 'omit' from source: magic vars 15494 1726853358.39783: starting attempt loop 15494 1726853358.39785: running the handler 15494 1726853358.39787: variable 'ansible_facts' from source: unknown 15494 1726853358.39789: _low_level_execute_command(): starting 15494 1726853358.39790: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853358.40402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853358.40595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15494 1726853358.40714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853358.40811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853358.40834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853358.40956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853358.42630: stdout chunk (state=3): >>>/root <<< 15494 1726853358.42726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853358.42895: stderr chunk (state=3): >>><<< 15494 1726853358.42898: stdout chunk (state=3): >>><<< 15494 1726853358.42916: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853358.42934: _low_level_execute_command(): starting 15494 1726853358.42945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803 `" && echo ansible-tmp-1726853358.4292326-16690-240041005216803="` echo /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803 `" ) && sleep 0' 15494 1726853358.43500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853358.43514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853358.43528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853358.43546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853358.43561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853358.43655: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853358.43677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853358.43746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853358.45618: stdout chunk (state=3): >>>ansible-tmp-1726853358.4292326-16690-240041005216803=/root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803 <<< 15494 1726853358.45814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853358.45835: stdout chunk (state=3): >>><<< 15494 1726853358.45849: stderr chunk (state=3): >>><<< 15494 1726853358.45876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853358.4292326-16690-240041005216803=/root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853358.45916: variable 'ansible_module_compression' from source: unknown 15494 1726853358.45979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853358.46100: variable 'ansible_facts' from source: unknown 15494 1726853358.46332: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py 15494 1726853358.46493: Sending initial data 15494 1726853358.46587: Sent initial data (154 bytes) 15494 1726853358.47237: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853358.47261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853358.47298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853358.47337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 15494 1726853358.47366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853358.47444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853358.47535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853358.47608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853358.49132: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15494 1726853358.49146: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853358.49297: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853358.49352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpuf2kkbk1 /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py <<< 15494 1726853358.49418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py" <<< 15494 1726853358.49467: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpuf2kkbk1" to remote "/root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py" <<< 15494 1726853358.51384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853358.51387: stdout chunk (state=3): >>><<< 15494 1726853358.51390: stderr chunk (state=3): >>><<< 15494 1726853358.51392: done transferring module to remote 15494 1726853358.51405: _low_level_execute_command(): starting 15494 1726853358.51557: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/ /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py && sleep 0' 15494 1726853358.52681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853358.52753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853358.54594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853358.54677: stderr chunk (state=3): >>><<< 15494 1726853358.54687: stdout chunk (state=3): >>><<< 15494 1726853358.54725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853358.54738: _low_level_execute_command(): starting 15494 1726853358.54741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/AnsiballZ_setup.py && sleep 0' 15494 1726853358.55970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853358.56045: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853358.56062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853358.56080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853358.56238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853360.19665: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": 
{"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 526, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797081088, "block_size": 4096, "block_total": 65519099, "block_available": 63915303, "block_used": 1603796, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "20", "epoch": "1726853360", "epoch_int": "1726853360", "date": "2024-09-20", "time": "13:29:20", "iso8601_micro": "2024-09-20T17:29:20.192930Z", "iso8601": "2024-09-20T17:29:20Z", "iso8601_basic": "20240920T132920192930", "iso8601_basic_short": "20240920T132920", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.56201171875, "5m": 0.361328125, "15m": 0.16015625}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} <<< 15494 1726853360.21645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853360.21696: stderr chunk (state=3): >>><<< 15494 1726853360.21700: stdout chunk (state=3): >>><<< 15494 1726853360.21880: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": 
"ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", 
"broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": 
{"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 526, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797081088, "block_size": 4096, "block_total": 65519099, "block_available": 63915303, "block_used": 1603796, "inode_total": 131070960, "inode_available": 131029066, "inode_used": 41894, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "20", "epoch": "1726853360", "epoch_int": "1726853360", "date": "2024-09-20", "time": "13:29:20", "iso8601_micro": "2024-09-20T17:29:20.192930Z", "iso8601": "2024-09-20T17:29:20Z", "iso8601_basic": "20240920T132920192930", "iso8601_basic_short": "20240920T132920", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, 
"ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.56201171875, "5m": 0.361328125, "15m": 0.16015625}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853360.22154: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853360.22187: _low_level_execute_command(): starting 15494 1726853360.22197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853358.4292326-16690-240041005216803/ > /dev/null 2>&1 && sleep 0' 15494 1726853360.22789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853360.22803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853360.22817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.22837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853360.22854: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853360.22866: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853360.22884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.22979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853360.23192: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853360.23246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853360.25095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853360.25143: stderr chunk (state=3): >>><<< 15494 1726853360.25153: stdout chunk (state=3): >>><<< 15494 1726853360.25172: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853360.25185: handler run complete 15494 1726853360.25319: variable 'ansible_facts' from source: unknown 15494 1726853360.25432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.25742: variable 'ansible_facts' from source: unknown 15494 1726853360.25831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.25968: attempt loop complete, returning result 15494 1726853360.25981: _execute() done 15494 1726853360.25990: dumping result to json 15494 1726853360.26024: done dumping result, returning 15494 1726853360.26037: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-0000000003a1] 15494 1726853360.26048: sending task result for task 02083763-bbaf-0028-1a50-0000000003a1 ok: [managed_node1] 15494 1726853360.26994: no more pending results, returning what we have 15494 1726853360.26997: results queue empty 15494 1726853360.26998: checking for any_errors_fatal 15494 1726853360.27000: done checking for any_errors_fatal 15494 1726853360.27000: checking for max_fail_percentage 15494 1726853360.27002: done checking for max_fail_percentage 15494 1726853360.27003: checking to see if all hosts have failed and the running result is not ok 15494 1726853360.27004: done checking to see if all hosts have failed 15494 1726853360.27004: getting the remaining hosts for this loop 15494 1726853360.27005: done getting the remaining hosts for this loop 15494 1726853360.27009: getting the next task for host managed_node1 15494 1726853360.27013: done getting next task for host managed_node1 15494 1726853360.27015: ^ task is: TASK: meta (flush_handlers) 15494 1726853360.27017: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853360.27020: getting variables 15494 1726853360.27022: in VariableManager get_vars() 15494 1726853360.27052: Calling all_inventory to load vars for managed_node1 15494 1726853360.27054: Calling groups_inventory to load vars for managed_node1 15494 1726853360.27056: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.27063: done sending task result for task 02083763-bbaf-0028-1a50-0000000003a1 15494 1726853360.27066: WORKER PROCESS EXITING 15494 1726853360.27077: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.27080: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.27084: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.28299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.29842: done with get_vars() 15494 1726853360.29864: done getting variables 15494 1726853360.29932: in VariableManager get_vars() 15494 1726853360.29944: Calling all_inventory to load vars for managed_node1 15494 1726853360.29946: Calling groups_inventory to load vars for managed_node1 15494 1726853360.29948: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.29953: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.29955: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.29958: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.31429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.33246: done with get_vars() 15494 1726853360.33287: done queuing things up, now waiting for results queue to drain 15494 
1726853360.33289: results queue empty 15494 1726853360.33290: checking for any_errors_fatal 15494 1726853360.33299: done checking for any_errors_fatal 15494 1726853360.33300: checking for max_fail_percentage 15494 1726853360.33300: done checking for max_fail_percentage 15494 1726853360.33301: checking to see if all hosts have failed and the running result is not ok 15494 1726853360.33302: done checking to see if all hosts have failed 15494 1726853360.33302: getting the remaining hosts for this loop 15494 1726853360.33303: done getting the remaining hosts for this loop 15494 1726853360.33310: getting the next task for host managed_node1 15494 1726853360.33316: done getting next task for host managed_node1 15494 1726853360.33319: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15494 1726853360.33321: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853360.33331: getting variables 15494 1726853360.33332: in VariableManager get_vars() 15494 1726853360.33346: Calling all_inventory to load vars for managed_node1 15494 1726853360.33351: Calling groups_inventory to load vars for managed_node1 15494 1726853360.33353: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.33358: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.33360: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.33363: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.34714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.37257: done with get_vars() 15494 1726853360.37286: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:29:20 -0400 (0:00:01.996) 0:00:28.989 ****** 15494 1726853360.37394: entering _queue_task() for managed_node1/include_tasks 15494 1726853360.37832: worker is 1 (out of 1 available) 15494 1726853360.37846: exiting _queue_task() for managed_node1/include_tasks 15494 1726853360.37978: done queuing things up, now waiting for results queue to drain 15494 1726853360.37980: waiting for pending results... 
15494 1726853360.38192: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15494 1726853360.38324: in run() - task 02083763-bbaf-0028-1a50-00000000005a 15494 1726853360.38346: variable 'ansible_search_path' from source: unknown 15494 1726853360.38358: variable 'ansible_search_path' from source: unknown 15494 1726853360.38409: calling self._execute() 15494 1726853360.38527: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.38542: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853360.38563: variable 'omit' from source: magic vars 15494 1726853360.38989: variable 'ansible_distribution_major_version' from source: facts 15494 1726853360.39031: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853360.39042: _execute() done 15494 1726853360.39061: dumping result to json 15494 1726853360.39084: done dumping result, returning 15494 1726853360.39138: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-0028-1a50-00000000005a] 15494 1726853360.39141: sending task result for task 02083763-bbaf-0028-1a50-00000000005a 15494 1726853360.39217: done sending task result for task 02083763-bbaf-0028-1a50-00000000005a 15494 1726853360.39220: WORKER PROCESS EXITING 15494 1726853360.39266: no more pending results, returning what we have 15494 1726853360.39383: in VariableManager get_vars() 15494 1726853360.39430: Calling all_inventory to load vars for managed_node1 15494 1726853360.39434: Calling groups_inventory to load vars for managed_node1 15494 1726853360.39437: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.39452: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.39456: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.39459: Calling 
groups_plugins_play to load vars for managed_node1 15494 1726853360.42193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.44070: done with get_vars() 15494 1726853360.44329: variable 'ansible_search_path' from source: unknown 15494 1726853360.44331: variable 'ansible_search_path' from source: unknown 15494 1726853360.44363: we have included files to process 15494 1726853360.44364: generating all_blocks data 15494 1726853360.44366: done generating all_blocks data 15494 1726853360.44367: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853360.44368: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853360.44370: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15494 1726853360.45211: done processing included file 15494 1726853360.45213: iterating over new_blocks loaded from include file 15494 1726853360.45214: in VariableManager get_vars() 15494 1726853360.45236: done with get_vars() 15494 1726853360.45238: filtering new block on tags 15494 1726853360.45257: done filtering new block on tags 15494 1726853360.45260: in VariableManager get_vars() 15494 1726853360.45279: done with get_vars() 15494 1726853360.45281: filtering new block on tags 15494 1726853360.45297: done filtering new block on tags 15494 1726853360.45299: in VariableManager get_vars() 15494 1726853360.45316: done with get_vars() 15494 1726853360.45317: filtering new block on tags 15494 1726853360.45330: done filtering new block on tags 15494 1726853360.45331: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 15494 1726853360.45336: extending task lists for 
all hosts with included blocks 15494 1726853360.45677: done extending task lists 15494 1726853360.45678: done processing included files 15494 1726853360.45679: results queue empty 15494 1726853360.45680: checking for any_errors_fatal 15494 1726853360.45681: done checking for any_errors_fatal 15494 1726853360.45682: checking for max_fail_percentage 15494 1726853360.45683: done checking for max_fail_percentage 15494 1726853360.45684: checking to see if all hosts have failed and the running result is not ok 15494 1726853360.45685: done checking to see if all hosts have failed 15494 1726853360.45685: getting the remaining hosts for this loop 15494 1726853360.45687: done getting the remaining hosts for this loop 15494 1726853360.45689: getting the next task for host managed_node1 15494 1726853360.45693: done getting next task for host managed_node1 15494 1726853360.45695: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15494 1726853360.45698: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853360.45707: getting variables 15494 1726853360.45708: in VariableManager get_vars() 15494 1726853360.45722: Calling all_inventory to load vars for managed_node1 15494 1726853360.45725: Calling groups_inventory to load vars for managed_node1 15494 1726853360.45727: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.45732: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.45734: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.45737: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.47056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.48594: done with get_vars() 15494 1726853360.48622: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:29:20 -0400 (0:00:00.113) 0:00:29.102 ****** 15494 1726853360.48704: entering _queue_task() for managed_node1/setup 15494 1726853360.49054: worker is 1 (out of 1 available) 15494 1726853360.49067: exiting _queue_task() for managed_node1/setup 15494 1726853360.49281: done queuing things up, now waiting for results queue to drain 15494 1726853360.49282: waiting for pending results... 
15494 1726853360.49488: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15494 1726853360.49495: in run() - task 02083763-bbaf-0028-1a50-0000000003e2 15494 1726853360.49516: variable 'ansible_search_path' from source: unknown 15494 1726853360.49522: variable 'ansible_search_path' from source: unknown 15494 1726853360.49562: calling self._execute() 15494 1726853360.49659: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.49668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853360.49683: variable 'omit' from source: magic vars 15494 1726853360.50461: variable 'ansible_distribution_major_version' from source: facts 15494 1726853360.50569: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853360.50809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853360.54175: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853360.54244: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853360.54296: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853360.54334: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853360.54374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853360.54459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853360.54500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853360.54530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853360.54582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853360.54605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853360.54665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853360.54699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853360.54723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853360.54761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853360.54778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853360.54934: variable '__network_required_facts' from source: role 
'' defaults 15494 1726853360.54952: variable 'ansible_facts' from source: unknown 15494 1726853360.55715: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15494 1726853360.55722: when evaluation is False, skipping this task 15494 1726853360.55728: _execute() done 15494 1726853360.55777: dumping result to json 15494 1726853360.55780: done dumping result, returning 15494 1726853360.55782: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-0028-1a50-0000000003e2] 15494 1726853360.55784: sending task result for task 02083763-bbaf-0028-1a50-0000000003e2 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853360.55924: no more pending results, returning what we have 15494 1726853360.55929: results queue empty 15494 1726853360.55930: checking for any_errors_fatal 15494 1726853360.55931: done checking for any_errors_fatal 15494 1726853360.55932: checking for max_fail_percentage 15494 1726853360.55933: done checking for max_fail_percentage 15494 1726853360.55934: checking to see if all hosts have failed and the running result is not ok 15494 1726853360.55935: done checking to see if all hosts have failed 15494 1726853360.55936: getting the remaining hosts for this loop 15494 1726853360.55938: done getting the remaining hosts for this loop 15494 1726853360.55941: getting the next task for host managed_node1 15494 1726853360.55953: done getting next task for host managed_node1 15494 1726853360.55957: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15494 1726853360.55960: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853360.55976: getting variables 15494 1726853360.55978: in VariableManager get_vars() 15494 1726853360.56019: Calling all_inventory to load vars for managed_node1 15494 1726853360.56021: Calling groups_inventory to load vars for managed_node1 15494 1726853360.56024: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.56034: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.56037: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.56040: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.56978: done sending task result for task 02083763-bbaf-0028-1a50-0000000003e2 15494 1726853360.56982: WORKER PROCESS EXITING 15494 1726853360.58572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.60944: done with get_vars() 15494 1726853360.60975: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:29:20 -0400 (0:00:00.123) 0:00:29.226 ****** 15494 1726853360.61084: entering _queue_task() for managed_node1/stat 15494 1726853360.61435: worker is 1 (out of 1 available) 15494 1726853360.61449: exiting _queue_task() for managed_node1/stat 15494 1726853360.61462: done queuing things up, now waiting for results queue to drain 15494 1726853360.61463: waiting for pending results... 
15494 1726853360.61765: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 15494 1726853360.61932: in run() - task 02083763-bbaf-0028-1a50-0000000003e4 15494 1726853360.61955: variable 'ansible_search_path' from source: unknown 15494 1726853360.61963: variable 'ansible_search_path' from source: unknown 15494 1726853360.62010: calling self._execute() 15494 1726853360.62121: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.62143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853360.62160: variable 'omit' from source: magic vars 15494 1726853360.62534: variable 'ansible_distribution_major_version' from source: facts 15494 1726853360.62551: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853360.62785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853360.63006: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853360.63056: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853360.63097: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853360.63145: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853360.63242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853360.63274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853360.63305: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853360.63339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853360.63421: variable '__network_is_ostree' from source: set_fact 15494 1726853360.63677: Evaluated conditional (not __network_is_ostree is defined): False 15494 1726853360.63680: when evaluation is False, skipping this task 15494 1726853360.63683: _execute() done 15494 1726853360.63685: dumping result to json 15494 1726853360.63688: done dumping result, returning 15494 1726853360.63690: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-0028-1a50-0000000003e4] 15494 1726853360.63693: sending task result for task 02083763-bbaf-0028-1a50-0000000003e4 15494 1726853360.63757: done sending task result for task 02083763-bbaf-0028-1a50-0000000003e4 15494 1726853360.63760: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15494 1726853360.63811: no more pending results, returning what we have 15494 1726853360.63815: results queue empty 15494 1726853360.63816: checking for any_errors_fatal 15494 1726853360.63820: done checking for any_errors_fatal 15494 1726853360.63821: checking for max_fail_percentage 15494 1726853360.63822: done checking for max_fail_percentage 15494 1726853360.63823: checking to see if all hosts have failed and the running result is not ok 15494 1726853360.63825: done checking to see if all hosts have failed 15494 1726853360.63826: getting the remaining hosts for this loop 15494 1726853360.63828: done getting the remaining hosts for this loop 15494 
1726853360.63831: getting the next task for host managed_node1 15494 1726853360.63837: done getting next task for host managed_node1 15494 1726853360.63842: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15494 1726853360.63845: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853360.63859: getting variables 15494 1726853360.63860: in VariableManager get_vars() 15494 1726853360.63900: Calling all_inventory to load vars for managed_node1 15494 1726853360.63903: Calling groups_inventory to load vars for managed_node1 15494 1726853360.63905: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.63916: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.63919: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.63922: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.65438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.67149: done with get_vars() 15494 1726853360.67178: done getting variables 15494 1726853360.67238: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:29:20 -0400 (0:00:00.061) 0:00:29.288 ****** 15494 1726853360.67270: entering _queue_task() for managed_node1/set_fact 15494 1726853360.67594: worker is 1 (out of 1 available) 15494 1726853360.67606: exiting _queue_task() for managed_node1/set_fact 15494 1726853360.67618: done queuing things up, now waiting for results queue to drain 15494 1726853360.67619: waiting for pending results... 15494 1726853360.67895: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15494 1726853360.68037: in run() - task 02083763-bbaf-0028-1a50-0000000003e5 15494 1726853360.68060: variable 'ansible_search_path' from source: unknown 15494 1726853360.68068: variable 'ansible_search_path' from source: unknown 15494 1726853360.68120: calling self._execute() 15494 1726853360.68231: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.68276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853360.68279: variable 'omit' from source: magic vars 15494 1726853360.68613: variable 'ansible_distribution_major_version' from source: facts 15494 1726853360.68630: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853360.68792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853360.69048: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853360.69276: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853360.69279: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 
1726853360.69282: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853360.69284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853360.69286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853360.69310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853360.69341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853360.69437: variable '__network_is_ostree' from source: set_fact 15494 1726853360.69450: Evaluated conditional (not __network_is_ostree is defined): False 15494 1726853360.69457: when evaluation is False, skipping this task 15494 1726853360.69464: _execute() done 15494 1726853360.69469: dumping result to json 15494 1726853360.69479: done dumping result, returning 15494 1726853360.69489: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-0028-1a50-0000000003e5] 15494 1726853360.69497: sending task result for task 02083763-bbaf-0028-1a50-0000000003e5 15494 1726853360.69599: done sending task result for task 02083763-bbaf-0028-1a50-0000000003e5 15494 1726853360.69605: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15494 1726853360.69661: no more pending results, returning what we 
have 15494 1726853360.69666: results queue empty 15494 1726853360.69666: checking for any_errors_fatal 15494 1726853360.69674: done checking for any_errors_fatal 15494 1726853360.69675: checking for max_fail_percentage 15494 1726853360.69677: done checking for max_fail_percentage 15494 1726853360.69677: checking to see if all hosts have failed and the running result is not ok 15494 1726853360.69678: done checking to see if all hosts have failed 15494 1726853360.69679: getting the remaining hosts for this loop 15494 1726853360.69681: done getting the remaining hosts for this loop 15494 1726853360.69685: getting the next task for host managed_node1 15494 1726853360.69694: done getting next task for host managed_node1 15494 1726853360.69698: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15494 1726853360.69701: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853360.69715: getting variables 15494 1726853360.69717: in VariableManager get_vars() 15494 1726853360.69759: Calling all_inventory to load vars for managed_node1 15494 1726853360.69762: Calling groups_inventory to load vars for managed_node1 15494 1726853360.69764: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853360.69981: Calling all_plugins_play to load vars for managed_node1 15494 1726853360.69985: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853360.69989: Calling groups_plugins_play to load vars for managed_node1 15494 1726853360.71480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853360.73043: done with get_vars() 15494 1726853360.73068: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:29:20 -0400 (0:00:00.058) 0:00:29.347 ****** 15494 1726853360.73165: entering _queue_task() for managed_node1/service_facts 15494 1726853360.73494: worker is 1 (out of 1 available) 15494 1726853360.73506: exiting _queue_task() for managed_node1/service_facts 15494 1726853360.73517: done queuing things up, now waiting for results queue to drain 15494 1726853360.73518: waiting for pending results... 
15494 1726853360.73786: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 15494 1726853360.73915: in run() - task 02083763-bbaf-0028-1a50-0000000003e7 15494 1726853360.73936: variable 'ansible_search_path' from source: unknown 15494 1726853360.73976: variable 'ansible_search_path' from source: unknown 15494 1726853360.73991: calling self._execute() 15494 1726853360.74079: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.74093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853360.74106: variable 'omit' from source: magic vars 15494 1726853360.74459: variable 'ansible_distribution_major_version' from source: facts 15494 1726853360.74526: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853360.74529: variable 'omit' from source: magic vars 15494 1726853360.74537: variable 'omit' from source: magic vars 15494 1726853360.74570: variable 'omit' from source: magic vars 15494 1726853360.74613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853360.74654: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853360.74680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853360.74697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853360.74711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853360.74747: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853360.74755: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.74761: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 15494 1726853360.74960: Set connection var ansible_connection to ssh 15494 1726853360.74964: Set connection var ansible_pipelining to False 15494 1726853360.74966: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853360.74969: Set connection var ansible_shell_type to sh 15494 1726853360.74972: Set connection var ansible_timeout to 10 15494 1726853360.74975: Set connection var ansible_shell_executable to /bin/sh 15494 1726853360.74977: variable 'ansible_shell_executable' from source: unknown 15494 1726853360.74979: variable 'ansible_connection' from source: unknown 15494 1726853360.74982: variable 'ansible_module_compression' from source: unknown 15494 1726853360.74984: variable 'ansible_shell_type' from source: unknown 15494 1726853360.74986: variable 'ansible_shell_executable' from source: unknown 15494 1726853360.74988: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853360.74990: variable 'ansible_pipelining' from source: unknown 15494 1726853360.74992: variable 'ansible_timeout' from source: unknown 15494 1726853360.74994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853360.75163: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853360.75181: variable 'omit' from source: magic vars 15494 1726853360.75188: starting attempt loop 15494 1726853360.75193: running the handler 15494 1726853360.75208: _low_level_execute_command(): starting 15494 1726853360.75222: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853360.75919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853360.75937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15494 1726853360.75952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.75989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.76076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853360.76098: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853360.76114: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853360.76198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853360.77875: stdout chunk (state=3): >>>/root <<< 15494 1726853360.77999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853360.78036: stderr chunk (state=3): >>><<< 15494 1726853360.78058: stdout chunk (state=3): >>><<< 15494 1726853360.78088: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853360.78115: _low_level_execute_command(): starting 15494 1726853360.78127: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739 `" && echo ansible-tmp-1726853360.781005-16815-71908390085739="` echo /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739 `" ) && sleep 0' 15494 1726853360.78794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853360.78811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853360.78829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.78919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.78966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853360.78987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853360.79010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853360.79083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853360.80996: stdout chunk (state=3): >>>ansible-tmp-1726853360.781005-16815-71908390085739=/root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739 <<< 15494 1726853360.81156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853360.81160: stdout chunk (state=3): >>><<< 15494 1726853360.81162: stderr chunk (state=3): >>><<< 15494 1726853360.81181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853360.781005-16815-71908390085739=/root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853360.81378: variable 'ansible_module_compression' from source: unknown 15494 1726853360.81381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15494 1726853360.81383: variable 'ansible_facts' from source: unknown 15494 1726853360.81422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py 15494 1726853360.81613: Sending initial data 15494 1726853360.81622: Sent initial data (160 bytes) 15494 1726853360.82256: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853360.82273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853360.82289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.82368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.82416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853360.82434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853360.82459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853360.82525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853360.84081: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853360.84143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853360.84201: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp1rlu14fu /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py <<< 15494 1726853360.84207: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py" <<< 15494 1726853360.84241: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp1rlu14fu" to remote "/root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py" <<< 15494 1726853360.85034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853360.85070: stderr chunk (state=3): >>><<< 15494 1726853360.85083: stdout chunk (state=3): >>><<< 15494 1726853360.85133: done transferring module to remote 15494 1726853360.85136: _low_level_execute_command(): starting 15494 1726853360.85138: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/ /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py && sleep 0' 15494 1726853360.85619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.85644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass <<< 15494 1726853360.85650: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853360.85653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.85703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853360.85707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853360.85750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853360.87599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853360.87602: stdout chunk (state=3): >>><<< 15494 1726853360.87604: stderr chunk (state=3): >>><<< 15494 1726853360.87606: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853360.87608: _low_level_execute_command(): starting 15494 1726853360.87610: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/AnsiballZ_service_facts.py && sleep 0' 15494 1726853360.88103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.88106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853360.88113: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.88119: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853360.88121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853360.88174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
<<< 15494 1726853360.88181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853360.88183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853360.88223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853362.40528: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": 
"sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15494 1726853362.41915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853362.41938: stderr chunk (state=3): >>>Shared connection to 10.31.45.153 closed. <<< 15494 1726853362.42333: stderr chunk (state=3): >>><<< 15494 1726853362.42337: stdout chunk (state=3): >>><<< 15494 1726853362.42367: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": 
"gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": 
{"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": 
"systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.45.153 closed. 15494 1726853362.55865: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853362.55882: _low_level_execute_command(): starting 15494 1726853362.55885: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853360.781005-16815-71908390085739/ > /dev/null 2>&1 && sleep 0' 15494 1726853362.57081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853362.57086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.57091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853362.57102: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.57155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853362.57158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853362.57418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853362.59284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853362.59330: stderr chunk (state=3): >>><<< 15494 1726853362.59333: stdout chunk (state=3): >>><<< 15494 1726853362.59345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15494 1726853362.59355: handler run complete 15494 1726853362.59768: variable 'ansible_facts' from source: unknown 15494 1726853362.60020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853362.61017: variable 'ansible_facts' from source: unknown 15494 1726853362.61408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853362.61647: attempt loop complete, returning result 15494 1726853362.61659: _execute() done 15494 1726853362.61666: dumping result to json 15494 1726853362.61743: done dumping result, returning 15494 1726853362.61757: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-0028-1a50-0000000003e7] 15494 1726853362.61766: sending task result for task 02083763-bbaf-0028-1a50-0000000003e7 15494 1726853362.67839: done sending task result for task 02083763-bbaf-0028-1a50-0000000003e7 15494 1726853362.67843: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853362.67928: no more pending results, returning what we have 15494 1726853362.67930: results queue empty 15494 1726853362.67931: checking for any_errors_fatal 15494 1726853362.67933: done checking for any_errors_fatal 15494 1726853362.67934: checking for max_fail_percentage 15494 1726853362.67935: done checking for max_fail_percentage 15494 1726853362.67936: checking to see if all hosts have failed and the running result is not ok 15494 1726853362.67937: done checking to see if all hosts have failed 15494 1726853362.67937: getting the remaining hosts for this loop 15494 1726853362.67938: done getting the remaining hosts for this loop 15494 1726853362.67941: getting the next task for host managed_node1 15494 1726853362.67944: done getting next 
task for host managed_node1 15494 1726853362.67946: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15494 1726853362.67949: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853362.67956: getting variables 15494 1726853362.67957: in VariableManager get_vars() 15494 1726853362.67980: Calling all_inventory to load vars for managed_node1 15494 1726853362.67982: Calling groups_inventory to load vars for managed_node1 15494 1726853362.67984: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853362.67990: Calling all_plugins_play to load vars for managed_node1 15494 1726853362.67992: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853362.67994: Calling groups_plugins_play to load vars for managed_node1 15494 1726853362.70403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853362.72791: done with get_vars() 15494 1726853362.72822: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:29:22 -0400 (0:00:01.997) 0:00:31.345 ****** 15494 1726853362.72926: entering _queue_task() for managed_node1/package_facts 15494 1726853362.73445: worker is 1 (out of 1 available) 15494 1726853362.73459: exiting 
_queue_task() for managed_node1/package_facts 15494 1726853362.73469: done queuing things up, now waiting for results queue to drain 15494 1726853362.73473: waiting for pending results... 15494 1726853362.73713: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 15494 1726853362.73870: in run() - task 02083763-bbaf-0028-1a50-0000000003e8 15494 1726853362.73893: variable 'ansible_search_path' from source: unknown 15494 1726853362.73902: variable 'ansible_search_path' from source: unknown 15494 1726853362.73976: calling self._execute() 15494 1726853362.74076: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853362.74090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853362.74136: variable 'omit' from source: magic vars 15494 1726853362.74544: variable 'ansible_distribution_major_version' from source: facts 15494 1726853362.74560: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853362.74581: variable 'omit' from source: magic vars 15494 1726853362.74680: variable 'omit' from source: magic vars 15494 1726853362.74712: variable 'omit' from source: magic vars 15494 1726853362.74758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853362.74894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853362.74898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853362.74901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853362.74903: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853362.74932: variable 'inventory_hostname' from source: host vars 
for 'managed_node1' 15494 1726853362.74942: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853362.74949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853362.75069: Set connection var ansible_connection to ssh 15494 1726853362.75113: Set connection var ansible_pipelining to False 15494 1726853362.75117: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853362.75121: Set connection var ansible_shell_type to sh 15494 1726853362.75127: Set connection var ansible_timeout to 10 15494 1726853362.75130: Set connection var ansible_shell_executable to /bin/sh 15494 1726853362.75159: variable 'ansible_shell_executable' from source: unknown 15494 1726853362.75167: variable 'ansible_connection' from source: unknown 15494 1726853362.75220: variable 'ansible_module_compression' from source: unknown 15494 1726853362.75224: variable 'ansible_shell_type' from source: unknown 15494 1726853362.75227: variable 'ansible_shell_executable' from source: unknown 15494 1726853362.75229: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853362.75236: variable 'ansible_pipelining' from source: unknown 15494 1726853362.75238: variable 'ansible_timeout' from source: unknown 15494 1726853362.75240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853362.75582: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853362.75844: variable 'omit' from source: magic vars 15494 1726853362.75848: starting attempt loop 15494 1726853362.75851: running the handler 15494 1726853362.75853: _low_level_execute_command(): starting 15494 1726853362.75855: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 
1726853362.76924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853362.76963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.77121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853362.77124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853362.77126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853362.77300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853362.78991: stdout chunk (state=3): >>>/root <<< 15494 1726853362.79087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853362.79347: stderr chunk (state=3): >>><<< 15494 1726853362.79351: stdout chunk (state=3): >>><<< 15494 1726853362.79356: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853362.79358: _low_level_execute_command(): starting 15494 1726853362.79361: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466 `" && echo ansible-tmp-1726853362.792538-16900-5685611071466="` echo /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466 `" ) && sleep 0' 15494 1726853362.80555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853362.80576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.80649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853362.80992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853362.82842: stdout chunk (state=3): >>>ansible-tmp-1726853362.792538-16900-5685611071466=/root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466 <<< 15494 1726853362.82944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853362.83049: stderr chunk (state=3): >>><<< 15494 1726853362.83067: stdout chunk (state=3): >>><<< 15494 1726853362.83085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853362.792538-16900-5685611071466=/root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853362.83164: variable 'ansible_module_compression' from source: unknown 15494 1726853362.83304: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15494 1726853362.83581: variable 'ansible_facts' from source: unknown 15494 1726853362.83906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py 15494 1726853362.84198: Sending initial data 15494 1726853362.84202: Sent initial data (159 bytes) 15494 1726853362.85307: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853362.85311: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853362.85313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.85362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853362.85401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853362.85675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853362.85693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853362.87261: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853362.87308: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853362.87405: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp0gjz1l01 /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py <<< 15494 1726853362.87419: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py" <<< 15494 1726853362.87531: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp0gjz1l01" to remote "/root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py" <<< 15494 1726853362.90226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853362.90480: stderr chunk (state=3): >>><<< 15494 1726853362.90484: stdout chunk (state=3): >>><<< 15494 1726853362.90487: done transferring module to remote 15494 1726853362.90489: _low_level_execute_command(): starting 15494 1726853362.90492: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/ /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py && sleep 0' 15494 1726853362.91851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853362.91874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853362.91882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853362.91918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.91958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853362.92024: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853362.92042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853362.92069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853362.92189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853362.94074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853362.94085: stdout chunk (state=3): >>><<< 15494 1726853362.94096: stderr chunk (state=3): >>><<< 15494 1726853362.94117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853362.94170: _low_level_execute_command(): starting 15494 1726853362.94183: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/AnsiballZ_package_facts.py && sleep 0' 15494 1726853362.95233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853362.95269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853362.95298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853362.95317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853362.95352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853362.95463: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853362.95768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853362.96031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853363.39828: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": 
[{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15494 1726853363.39833: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 15494 1726853363.39853: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 15494 1726853363.39914: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": 
[{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", 
"release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 15494 1726853363.39926: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": 
[{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 15494 1726853363.39947: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": 
"256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 15494 1726853363.39981: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", <<< 15494 1726853363.39995: stdout chunk (state=3): >>>"version": "102", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": 
"9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": 
"perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", <<< 15494 1726853363.40029: stdout chunk (state=3): >>>"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", 
"version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": 
[{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": 
"23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], <<< 15494 1726853363.40041: stdout chunk (state=3): >>>"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15494 1726853363.41763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853363.41794: stderr chunk (state=3): >>><<< 15494 1726853363.41797: stdout chunk (state=3): >>><<< 15494 1726853363.41839: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853363.43219: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853363.43276: _low_level_execute_command(): starting 15494 1726853363.43279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853362.792538-16900-5685611071466/ > /dev/null 2>&1 && sleep 0' 15494 1726853363.43846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853363.43849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853363.43852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853363.43854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853363.43856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 
1726853363.43859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853363.43952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853363.44002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853363.45836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853363.45937: stderr chunk (state=3): >>><<< 15494 1726853363.45942: stdout chunk (state=3): >>><<< 15494 1726853363.45945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 15494 1726853363.45947: handler run complete 15494 1726853363.46496: variable 'ansible_facts' from source: unknown 15494 1726853363.46818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.47901: variable 'ansible_facts' from source: unknown 15494 1726853363.48141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.48535: attempt loop complete, returning result 15494 1726853363.48542: _execute() done 15494 1726853363.48545: dumping result to json 15494 1726853363.48735: done dumping result, returning 15494 1726853363.48738: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-0028-1a50-0000000003e8] 15494 1726853363.48740: sending task result for task 02083763-bbaf-0028-1a50-0000000003e8 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853363.50204: done sending task result for task 02083763-bbaf-0028-1a50-0000000003e8 15494 1726853363.50208: WORKER PROCESS EXITING 15494 1726853363.50214: no more pending results, returning what we have 15494 1726853363.50216: results queue empty 15494 1726853363.50217: checking for any_errors_fatal 15494 1726853363.50220: done checking for any_errors_fatal 15494 1726853363.50221: checking for max_fail_percentage 15494 1726853363.50224: done checking for max_fail_percentage 15494 1726853363.50225: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.50226: done checking to see if all hosts have failed 15494 1726853363.50227: getting the remaining hosts for this loop 15494 1726853363.50228: done getting the remaining hosts for this loop 15494 1726853363.50231: getting the next task for host managed_node1 15494 1726853363.50236: done 
getting next task for host managed_node1 15494 1726853363.50238: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15494 1726853363.50240: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853363.50246: getting variables 15494 1726853363.50249: in VariableManager get_vars() 15494 1726853363.50283: Calling all_inventory to load vars for managed_node1 15494 1726853363.50285: Calling groups_inventory to load vars for managed_node1 15494 1726853363.50287: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.50294: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.50295: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.50297: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.50983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.52086: done with get_vars() 15494 1726853363.52107: done getting variables 15494 1726853363.52154: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:29:23 -0400 (0:00:00.792) 0:00:32.137 ****** 15494 1726853363.52179: entering _queue_task() for managed_node1/debug 15494 1726853363.52444: worker is 1 (out of 1 available) 15494 
1726853363.52459: exiting _queue_task() for managed_node1/debug 15494 1726853363.52473: done queuing things up, now waiting for results queue to drain 15494 1726853363.52475: waiting for pending results... 15494 1726853363.52707: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 15494 1726853363.52788: in run() - task 02083763-bbaf-0028-1a50-00000000005b 15494 1726853363.52803: variable 'ansible_search_path' from source: unknown 15494 1726853363.52807: variable 'ansible_search_path' from source: unknown 15494 1726853363.52836: calling self._execute() 15494 1726853363.52944: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.52951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.52964: variable 'omit' from source: magic vars 15494 1726853363.53301: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.53330: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.53334: variable 'omit' from source: magic vars 15494 1726853363.53355: variable 'omit' from source: magic vars 15494 1726853363.53435: variable 'network_provider' from source: set_fact 15494 1726853363.53453: variable 'omit' from source: magic vars 15494 1726853363.53505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853363.53529: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853363.53553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853363.53579: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853363.53591: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15494 1726853363.53632: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853363.53636: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.53639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.53726: Set connection var ansible_connection to ssh 15494 1726853363.53730: Set connection var ansible_pipelining to False 15494 1726853363.53732: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853363.53735: Set connection var ansible_shell_type to sh 15494 1726853363.53737: Set connection var ansible_timeout to 10 15494 1726853363.53739: Set connection var ansible_shell_executable to /bin/sh 15494 1726853363.53750: variable 'ansible_shell_executable' from source: unknown 15494 1726853363.53753: variable 'ansible_connection' from source: unknown 15494 1726853363.53756: variable 'ansible_module_compression' from source: unknown 15494 1726853363.53758: variable 'ansible_shell_type' from source: unknown 15494 1726853363.53760: variable 'ansible_shell_executable' from source: unknown 15494 1726853363.53763: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.53765: variable 'ansible_pipelining' from source: unknown 15494 1726853363.53767: variable 'ansible_timeout' from source: unknown 15494 1726853363.53769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.53891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853363.53923: variable 'omit' from source: magic vars 15494 1726853363.53928: starting attempt loop 15494 1726853363.53932: running the handler 15494 1726853363.53975: handler run 
complete 15494 1726853363.53984: attempt loop complete, returning result 15494 1726853363.53986: _execute() done 15494 1726853363.53989: dumping result to json 15494 1726853363.53992: done dumping result, returning 15494 1726853363.53999: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-0028-1a50-00000000005b] 15494 1726853363.54002: sending task result for task 02083763-bbaf-0028-1a50-00000000005b 15494 1726853363.54093: done sending task result for task 02083763-bbaf-0028-1a50-00000000005b 15494 1726853363.54095: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 15494 1726853363.54161: no more pending results, returning what we have 15494 1726853363.54165: results queue empty 15494 1726853363.54166: checking for any_errors_fatal 15494 1726853363.54177: done checking for any_errors_fatal 15494 1726853363.54177: checking for max_fail_percentage 15494 1726853363.54179: done checking for max_fail_percentage 15494 1726853363.54180: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.54181: done checking to see if all hosts have failed 15494 1726853363.54182: getting the remaining hosts for this loop 15494 1726853363.54183: done getting the remaining hosts for this loop 15494 1726853363.54186: getting the next task for host managed_node1 15494 1726853363.54192: done getting next task for host managed_node1 15494 1726853363.54196: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15494 1726853363.54197: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.54205: getting variables 15494 1726853363.54207: in VariableManager get_vars() 15494 1726853363.54245: Calling all_inventory to load vars for managed_node1 15494 1726853363.54249: Calling groups_inventory to load vars for managed_node1 15494 1726853363.54251: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.54259: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.54261: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.54264: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.55394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.56544: done with get_vars() 15494 1726853363.56562: done getting variables 15494 1726853363.56630: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:29:23 -0400 (0:00:00.044) 0:00:32.182 ****** 15494 1726853363.56663: entering _queue_task() for managed_node1/fail 15494 1726853363.56936: worker is 1 (out of 1 available) 15494 1726853363.56948: exiting _queue_task() for managed_node1/fail 15494 1726853363.56961: done queuing things up, now waiting for results queue to drain 15494 1726853363.56962: waiting for pending results... 
15494 1726853363.57233: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15494 1726853363.57314: in run() - task 02083763-bbaf-0028-1a50-00000000005c 15494 1726853363.57329: variable 'ansible_search_path' from source: unknown 15494 1726853363.57333: variable 'ansible_search_path' from source: unknown 15494 1726853363.57388: calling self._execute() 15494 1726853363.57462: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.57466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.57477: variable 'omit' from source: magic vars 15494 1726853363.57855: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.57864: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.57994: variable 'network_state' from source: role '' defaults 15494 1726853363.57998: Evaluated conditional (network_state != {}): False 15494 1726853363.58001: when evaluation is False, skipping this task 15494 1726853363.58011: _execute() done 15494 1726853363.58020: dumping result to json 15494 1726853363.58023: done dumping result, returning 15494 1726853363.58026: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-0028-1a50-00000000005c] 15494 1726853363.58029: sending task result for task 02083763-bbaf-0028-1a50-00000000005c 15494 1726853363.58114: done sending task result for task 02083763-bbaf-0028-1a50-00000000005c 15494 1726853363.58116: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853363.58161: no more pending results, 
returning what we have 15494 1726853363.58174: results queue empty 15494 1726853363.58175: checking for any_errors_fatal 15494 1726853363.58183: done checking for any_errors_fatal 15494 1726853363.58187: checking for max_fail_percentage 15494 1726853363.58189: done checking for max_fail_percentage 15494 1726853363.58190: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.58191: done checking to see if all hosts have failed 15494 1726853363.58192: getting the remaining hosts for this loop 15494 1726853363.58193: done getting the remaining hosts for this loop 15494 1726853363.58197: getting the next task for host managed_node1 15494 1726853363.58206: done getting next task for host managed_node1 15494 1726853363.58210: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15494 1726853363.58213: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.58227: getting variables 15494 1726853363.58229: in VariableManager get_vars() 15494 1726853363.58263: Calling all_inventory to load vars for managed_node1 15494 1726853363.58265: Calling groups_inventory to load vars for managed_node1 15494 1726853363.58267: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.58278: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.58280: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.58284: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.59081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.60087: done with get_vars() 15494 1726853363.60102: done getting variables 15494 1726853363.60158: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:29:23 -0400 (0:00:00.035) 0:00:32.217 ****** 15494 1726853363.60188: entering _queue_task() for managed_node1/fail 15494 1726853363.60488: worker is 1 (out of 1 available) 15494 1726853363.60501: exiting _queue_task() for managed_node1/fail 15494 1726853363.60514: done queuing things up, now waiting for results queue to drain 15494 1726853363.60515: waiting for pending results... 
15494 1726853363.60728: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15494 1726853363.60789: in run() - task 02083763-bbaf-0028-1a50-00000000005d 15494 1726853363.60801: variable 'ansible_search_path' from source: unknown 15494 1726853363.60805: variable 'ansible_search_path' from source: unknown 15494 1726853363.60833: calling self._execute() 15494 1726853363.60917: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.60921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.60930: variable 'omit' from source: magic vars 15494 1726853363.61227: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.61236: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.61331: variable 'network_state' from source: role '' defaults 15494 1726853363.61340: Evaluated conditional (network_state != {}): False 15494 1726853363.61343: when evaluation is False, skipping this task 15494 1726853363.61346: _execute() done 15494 1726853363.61351: dumping result to json 15494 1726853363.61354: done dumping result, returning 15494 1726853363.61358: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-0028-1a50-00000000005d] 15494 1726853363.61374: sending task result for task 02083763-bbaf-0028-1a50-00000000005d 15494 1726853363.61515: done sending task result for task 02083763-bbaf-0028-1a50-00000000005d 15494 1726853363.61518: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853363.61610: no more pending results, returning what we have 15494 
1726853363.61614: results queue empty 15494 1726853363.61615: checking for any_errors_fatal 15494 1726853363.61619: done checking for any_errors_fatal 15494 1726853363.61620: checking for max_fail_percentage 15494 1726853363.61622: done checking for max_fail_percentage 15494 1726853363.61623: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.61623: done checking to see if all hosts have failed 15494 1726853363.61624: getting the remaining hosts for this loop 15494 1726853363.61625: done getting the remaining hosts for this loop 15494 1726853363.61628: getting the next task for host managed_node1 15494 1726853363.61633: done getting next task for host managed_node1 15494 1726853363.61642: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15494 1726853363.61644: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.61659: getting variables 15494 1726853363.61661: in VariableManager get_vars() 15494 1726853363.61694: Calling all_inventory to load vars for managed_node1 15494 1726853363.61697: Calling groups_inventory to load vars for managed_node1 15494 1726853363.61698: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.61710: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.61714: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.61718: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.62677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.63768: done with get_vars() 15494 1726853363.63786: done getting variables 15494 1726853363.63830: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:29:23 -0400 (0:00:00.036) 0:00:32.254 ****** 15494 1726853363.63862: entering _queue_task() for managed_node1/fail 15494 1726853363.64078: worker is 1 (out of 1 available) 15494 1726853363.64092: exiting _queue_task() for managed_node1/fail 15494 1726853363.64103: done queuing things up, now waiting for results queue to drain 15494 1726853363.64105: waiting for pending results... 
15494 1726853363.64292: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15494 1726853363.64386: in run() - task 02083763-bbaf-0028-1a50-00000000005e 15494 1726853363.64390: variable 'ansible_search_path' from source: unknown 15494 1726853363.64394: variable 'ansible_search_path' from source: unknown 15494 1726853363.64411: calling self._execute() 15494 1726853363.64549: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.64553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.64560: variable 'omit' from source: magic vars 15494 1726853363.64810: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.64819: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.64939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853363.66488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853363.66530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853363.66560: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853363.66587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853363.66607: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853363.66665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.66701: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.66718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.66744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.66758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.66828: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.66840: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15494 1726853363.66922: variable 'ansible_distribution' from source: facts 15494 1726853363.66926: variable '__network_rh_distros' from source: role '' defaults 15494 1726853363.66933: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15494 1726853363.67096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.67115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.67167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 
1726853363.67220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.67224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.67239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.67258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.67275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.67299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.67310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.67341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.67360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15494 1726853363.67405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.67418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.67432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.67677: variable 'network_connections' from source: play vars 15494 1726853363.67688: variable 'profile' from source: play vars 15494 1726853363.67728: variable 'profile' from source: play vars 15494 1726853363.67731: variable 'interface' from source: set_fact 15494 1726853363.67804: variable 'interface' from source: set_fact 15494 1726853363.67807: variable 'network_state' from source: role '' defaults 15494 1726853363.67854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853363.67960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853363.67992: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853363.68043: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853363.68050: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853363.68088: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853363.68106: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853363.68124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.68140: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853363.68162: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15494 1726853363.68165: when evaluation is False, skipping this task 15494 1726853363.68168: _execute() done 15494 1726853363.68173: dumping result to json 15494 1726853363.68175: done dumping result, returning 15494 1726853363.68181: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-0028-1a50-00000000005e] 15494 1726853363.68184: sending task result for task 02083763-bbaf-0028-1a50-00000000005e 15494 1726853363.68269: done sending task result for task 02083763-bbaf-0028-1a50-00000000005e 15494 1726853363.68274: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15494 
1726853363.68344: no more pending results, returning what we have 15494 1726853363.68348: results queue empty 15494 1726853363.68349: checking for any_errors_fatal 15494 1726853363.68356: done checking for any_errors_fatal 15494 1726853363.68357: checking for max_fail_percentage 15494 1726853363.68359: done checking for max_fail_percentage 15494 1726853363.68359: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.68360: done checking to see if all hosts have failed 15494 1726853363.68361: getting the remaining hosts for this loop 15494 1726853363.68362: done getting the remaining hosts for this loop 15494 1726853363.68366: getting the next task for host managed_node1 15494 1726853363.68374: done getting next task for host managed_node1 15494 1726853363.68378: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15494 1726853363.68379: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.68394: getting variables 15494 1726853363.68395: in VariableManager get_vars() 15494 1726853363.68433: Calling all_inventory to load vars for managed_node1 15494 1726853363.68436: Calling groups_inventory to load vars for managed_node1 15494 1726853363.68438: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.68446: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.68449: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.68451: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.69311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.71189: done with get_vars() 15494 1726853363.71225: done getting variables 15494 1726853363.71288: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:29:23 -0400 (0:00:00.074) 0:00:32.329 ****** 15494 1726853363.71322: entering _queue_task() for managed_node1/dnf 15494 1726853363.71651: worker is 1 (out of 1 available) 15494 1726853363.71670: exiting _queue_task() for managed_node1/dnf 15494 1726853363.71684: done queuing things up, now waiting for results queue to drain 15494 1726853363.71686: waiting for pending results... 
15494 1726853363.71889: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15494 1726853363.71977: in run() - task 02083763-bbaf-0028-1a50-00000000005f 15494 1726853363.71989: variable 'ansible_search_path' from source: unknown 15494 1726853363.71993: variable 'ansible_search_path' from source: unknown 15494 1726853363.72025: calling self._execute() 15494 1726853363.72103: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.72108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.72116: variable 'omit' from source: magic vars 15494 1726853363.72401: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.72409: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.72544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853363.74492: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853363.74527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853363.74569: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853363.74620: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853363.74655: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853363.74753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.74804: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.74852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.74895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.74906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.75016: variable 'ansible_distribution' from source: facts 15494 1726853363.75020: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.75041: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15494 1726853363.75118: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853363.75211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.75227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.75249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.75284: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.75296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.75346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.75366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.75385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.75409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.75419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.75447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.75468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 
1726853363.75487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.75510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.75520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.75624: variable 'network_connections' from source: play vars 15494 1726853363.75634: variable 'profile' from source: play vars 15494 1726853363.75681: variable 'profile' from source: play vars 15494 1726853363.75685: variable 'interface' from source: set_fact 15494 1726853363.75727: variable 'interface' from source: set_fact 15494 1726853363.75778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853363.75888: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853363.75918: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853363.75941: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853363.75966: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853363.76036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853363.76063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853363.76099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.76157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853363.76169: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853363.76341: variable 'network_connections' from source: play vars 15494 1726853363.76344: variable 'profile' from source: play vars 15494 1726853363.76452: variable 'profile' from source: play vars 15494 1726853363.76455: variable 'interface' from source: set_fact 15494 1726853363.76480: variable 'interface' from source: set_fact 15494 1726853363.76497: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853363.76500: when evaluation is False, skipping this task 15494 1726853363.76503: _execute() done 15494 1726853363.76505: dumping result to json 15494 1726853363.76509: done dumping result, returning 15494 1726853363.76516: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-0028-1a50-00000000005f] 15494 1726853363.76520: sending task result for task 02083763-bbaf-0028-1a50-00000000005f 15494 1726853363.76605: done sending task result for task 02083763-bbaf-0028-1a50-00000000005f 15494 1726853363.76608: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15494 1726853363.76655: no more pending results, returning what we have 15494 1726853363.76659: results queue empty 15494 1726853363.76660: checking for any_errors_fatal 15494 1726853363.76667: done checking for any_errors_fatal 15494 1726853363.76668: checking for max_fail_percentage 15494 1726853363.76670: done checking for max_fail_percentage 15494 1726853363.76672: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.76673: done checking to see if all hosts have failed 15494 1726853363.76674: getting the remaining hosts for this loop 15494 1726853363.76675: done getting the remaining hosts for this loop 15494 1726853363.76679: getting the next task for host managed_node1 15494 1726853363.76685: done getting next task for host managed_node1 15494 1726853363.76689: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15494 1726853363.76691: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.76704: getting variables 15494 1726853363.76705: in VariableManager get_vars() 15494 1726853363.76742: Calling all_inventory to load vars for managed_node1 15494 1726853363.76745: Calling groups_inventory to load vars for managed_node1 15494 1726853363.76747: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.76756: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.76759: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.76761: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.77903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.79341: done with get_vars() 15494 1726853363.79365: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15494 1726853363.79444: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:29:23 -0400 (0:00:00.081) 0:00:32.410 ****** 15494 1726853363.79476: entering _queue_task() for managed_node1/yum 15494 1726853363.79794: worker is 1 (out of 1 available) 15494 1726853363.79806: exiting _queue_task() for managed_node1/yum 15494 1726853363.79817: done queuing things up, now waiting for results queue to drain 15494 1726853363.79818: waiting for pending results... 
15494 1726853363.80198: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15494 1726853363.80204: in run() - task 02083763-bbaf-0028-1a50-000000000060 15494 1726853363.80226: variable 'ansible_search_path' from source: unknown 15494 1726853363.80233: variable 'ansible_search_path' from source: unknown 15494 1726853363.80275: calling self._execute() 15494 1726853363.80379: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.80390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.80476: variable 'omit' from source: magic vars 15494 1726853363.81176: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.81180: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.81503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853363.83943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853363.84019: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853363.84059: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853363.84099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853363.84134: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853363.84215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.84266: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.84298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.84347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.84368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.84470: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.84493: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15494 1726853363.84500: when evaluation is False, skipping this task 15494 1726853363.84507: _execute() done 15494 1726853363.84512: dumping result to json 15494 1726853363.84520: done dumping result, returning 15494 1726853363.84532: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000060] 15494 1726853363.84542: sending task result for task 02083763-bbaf-0028-1a50-000000000060 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15494 1726853363.84709: no more pending results, returning what we have 15494 1726853363.84713: results queue empty 15494 1726853363.84714: checking for any_errors_fatal 15494 1726853363.84723: done 
checking for any_errors_fatal 15494 1726853363.84724: checking for max_fail_percentage 15494 1726853363.84726: done checking for max_fail_percentage 15494 1726853363.84727: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.84728: done checking to see if all hosts have failed 15494 1726853363.84729: getting the remaining hosts for this loop 15494 1726853363.84730: done getting the remaining hosts for this loop 15494 1726853363.84734: getting the next task for host managed_node1 15494 1726853363.84741: done getting next task for host managed_node1 15494 1726853363.84745: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15494 1726853363.84747: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.84761: getting variables 15494 1726853363.84763: in VariableManager get_vars() 15494 1726853363.84806: Calling all_inventory to load vars for managed_node1 15494 1726853363.84809: Calling groups_inventory to load vars for managed_node1 15494 1726853363.84812: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.84823: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.84825: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.84828: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.85586: done sending task result for task 02083763-bbaf-0028-1a50-000000000060 15494 1726853363.85589: WORKER PROCESS EXITING 15494 1726853363.86643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853363.89313: done with get_vars() 15494 1726853363.89342: done getting variables 15494 1726853363.89400: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:29:23 -0400 (0:00:00.099) 0:00:32.510 ****** 15494 1726853363.89433: entering _queue_task() for managed_node1/fail 15494 1726853363.90196: worker is 1 (out of 1 available) 15494 1726853363.90208: exiting _queue_task() for managed_node1/fail 15494 1726853363.90220: done queuing things up, now waiting for results queue to drain 15494 1726853363.90221: waiting for pending results... 
15494 1726853363.90756: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15494 1726853363.91123: in run() - task 02083763-bbaf-0028-1a50-000000000061 15494 1726853363.91128: variable 'ansible_search_path' from source: unknown 15494 1726853363.91130: variable 'ansible_search_path' from source: unknown 15494 1726853363.91175: calling self._execute() 15494 1726853363.91434: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853363.91483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853363.91500: variable 'omit' from source: magic vars 15494 1726853363.92178: variable 'ansible_distribution_major_version' from source: facts 15494 1726853363.92196: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853363.92328: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853363.92577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853363.94843: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853363.94912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853363.94959: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853363.95001: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853363.95043: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853363.95120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15494 1726853363.95260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.95263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.95266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.95268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.95312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.95338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.95375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.95419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.95440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.95493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853363.95521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853363.95548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.95599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853363.95618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853363.95788: variable 'network_connections' from source: play vars 15494 1726853363.95817: variable 'profile' from source: play vars 15494 1726853363.95892: variable 'profile' from source: play vars 15494 1726853363.95913: variable 'interface' from source: set_fact 15494 1726853363.95970: variable 'interface' from source: set_fact 15494 1726853363.96055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853363.96229: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853363.96278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853363.96351: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853363.96354: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853363.96397: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853363.96423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853363.96460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853363.96494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853363.96568: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853363.96804: variable 'network_connections' from source: play vars 15494 1726853363.96814: variable 'profile' from source: play vars 15494 1726853363.96896: variable 'profile' from source: play vars 15494 1726853363.96899: variable 'interface' from source: set_fact 15494 1726853363.96947: variable 'interface' from source: set_fact 15494 1726853363.96979: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853363.97004: when evaluation is False, skipping this task 15494 1726853363.97007: _execute() done 15494 1726853363.97010: dumping result to json 15494 1726853363.97013: done dumping result, returning 15494 1726853363.97176: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000061] 15494 1726853363.97188: sending task result for task 02083763-bbaf-0028-1a50-000000000061 15494 1726853363.97254: done sending task result for task 02083763-bbaf-0028-1a50-000000000061 15494 1726853363.97257: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15494 1726853363.97309: no more pending results, returning what we have 15494 1726853363.97313: results queue empty 15494 1726853363.97314: checking for any_errors_fatal 15494 1726853363.97319: done checking for any_errors_fatal 15494 1726853363.97320: checking for max_fail_percentage 15494 1726853363.97321: done checking for max_fail_percentage 15494 1726853363.97322: checking to see if all hosts have failed and the running result is not ok 15494 1726853363.97323: done checking to see if all hosts have failed 15494 1726853363.97324: getting the remaining hosts for this loop 15494 1726853363.97325: done getting the remaining hosts for this loop 15494 1726853363.97329: getting the next task for host managed_node1 15494 1726853363.97336: done getting next task for host managed_node1 15494 1726853363.97340: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15494 1726853363.97342: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853363.97355: getting variables 15494 1726853363.97357: in VariableManager get_vars() 15494 1726853363.97403: Calling all_inventory to load vars for managed_node1 15494 1726853363.97405: Calling groups_inventory to load vars for managed_node1 15494 1726853363.97408: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853363.97419: Calling all_plugins_play to load vars for managed_node1 15494 1726853363.97422: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853363.97426: Calling groups_plugins_play to load vars for managed_node1 15494 1726853363.99012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853364.00617: done with get_vars() 15494 1726853364.00641: done getting variables 15494 1726853364.00707: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:29:24 -0400 (0:00:00.113) 0:00:32.623 ****** 15494 1726853364.00740: entering _queue_task() for managed_node1/package 15494 1726853364.01305: worker is 1 (out of 1 available) 15494 1726853364.01315: exiting _queue_task() for managed_node1/package 15494 1726853364.01328: done queuing things up, now waiting for results queue to drain 15494 1726853364.01329: waiting for pending results... 
15494 1726853364.01587: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 15494 1726853364.01650: in run() - task 02083763-bbaf-0028-1a50-000000000062 15494 1726853364.01681: variable 'ansible_search_path' from source: unknown 15494 1726853364.01787: variable 'ansible_search_path' from source: unknown 15494 1726853364.01791: calling self._execute() 15494 1726853364.01844: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.01855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.01894: variable 'omit' from source: magic vars 15494 1726853364.02257: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.02276: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853364.02480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853364.02872: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853364.02876: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853364.02879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853364.02928: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853364.03056: variable 'network_packages' from source: role '' defaults 15494 1726853364.03181: variable '__network_provider_setup' from source: role '' defaults 15494 1726853364.03208: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853364.03302: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853364.03306: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853364.03357: variable 
'__network_packages_default_nm' from source: role '' defaults 15494 1726853364.03552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853364.05977: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853364.06123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853364.06130: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853364.06141: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853364.06172: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853364.06277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.06311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.06354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.06650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.06880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 
1726853364.06883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.06894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.06922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.06963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.07005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.07531: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15494 1726853364.07691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.07785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.07857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.07941: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.07995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.08209: variable 'ansible_python' from source: facts 15494 1726853364.08240: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15494 1726853364.08593: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853364.08596: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853364.08932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.08966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.09000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.09140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.09162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.09277: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.09290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.09380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.09429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.09480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.09745: variable 'network_connections' from source: play vars 15494 1726853364.09976: variable 'profile' from source: play vars 15494 1726853364.09979: variable 'profile' from source: play vars 15494 1726853364.09981: variable 'interface' from source: set_fact 15494 1726853364.09988: variable 'interface' from source: set_fact 15494 1726853364.10324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853364.10327: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853364.10330: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.10393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853364.10499: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853364.11068: variable 'network_connections' from source: play vars 15494 1726853364.11185: variable 'profile' from source: play vars 15494 1726853364.11395: variable 'profile' from source: play vars 15494 1726853364.11415: variable 'interface' from source: set_fact 15494 1726853364.11491: variable 'interface' from source: set_fact 15494 1726853364.11612: variable '__network_packages_default_wireless' from source: role '' defaults 15494 1726853364.11844: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853364.12538: variable 'network_connections' from source: play vars 15494 1726853364.12542: variable 'profile' from source: play vars 15494 1726853364.12692: variable 'profile' from source: play vars 15494 1726853364.12702: variable 'interface' from source: set_fact 15494 1726853364.12975: variable 'interface' from source: set_fact 15494 1726853364.12998: variable '__network_packages_default_team' from source: role '' defaults 15494 1726853364.13112: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853364.13648: variable 'network_connections' from source: play vars 15494 1726853364.13660: variable 'profile' from source: play vars 15494 1726853364.13733: variable 'profile' from source: play vars 15494 1726853364.13742: variable 'interface' from source: set_fact 15494 1726853364.13848: variable 'interface' from source: set_fact 15494 1726853364.13904: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853364.13979: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853364.13993: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853364.14050: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853364.14268: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15494 1726853364.14764: variable 'network_connections' from source: play vars 15494 1726853364.14824: variable 'profile' from source: play vars 15494 1726853364.14843: variable 'profile' from source: play vars 15494 1726853364.14853: variable 'interface' from source: set_fact 15494 1726853364.14923: variable 'interface' from source: set_fact 15494 1726853364.14947: variable 'ansible_distribution' from source: facts 15494 1726853364.14957: variable '__network_rh_distros' from source: role '' defaults 15494 1726853364.14967: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.14990: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15494 1726853364.15176: variable 'ansible_distribution' from source: facts 15494 1726853364.15179: variable '__network_rh_distros' from source: role '' defaults 15494 1726853364.15261: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.15264: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15494 1726853364.15377: variable 'ansible_distribution' from source: facts 15494 1726853364.15386: variable '__network_rh_distros' from source: role '' defaults 15494 1726853364.15396: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.15436: variable 'network_provider' from source: set_fact 15494 1726853364.15459: variable 'ansible_facts' from source: unknown 15494 1726853364.16187: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15494 
1726853364.16198: when evaluation is False, skipping this task 15494 1726853364.16206: _execute() done 15494 1726853364.16356: dumping result to json 15494 1726853364.16359: done dumping result, returning 15494 1726853364.16362: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-0028-1a50-000000000062] 15494 1726853364.16364: sending task result for task 02083763-bbaf-0028-1a50-000000000062 15494 1726853364.16442: done sending task result for task 02083763-bbaf-0028-1a50-000000000062 15494 1726853364.16446: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15494 1726853364.16501: no more pending results, returning what we have 15494 1726853364.16506: results queue empty 15494 1726853364.16507: checking for any_errors_fatal 15494 1726853364.16514: done checking for any_errors_fatal 15494 1726853364.16515: checking for max_fail_percentage 15494 1726853364.16517: done checking for max_fail_percentage 15494 1726853364.16518: checking to see if all hosts have failed and the running result is not ok 15494 1726853364.16519: done checking to see if all hosts have failed 15494 1726853364.16520: getting the remaining hosts for this loop 15494 1726853364.16521: done getting the remaining hosts for this loop 15494 1726853364.16525: getting the next task for host managed_node1 15494 1726853364.16533: done getting next task for host managed_node1 15494 1726853364.16537: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15494 1726853364.16539: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 15494 1726853364.16553: getting variables 15494 1726853364.16555: in VariableManager get_vars() 15494 1726853364.16596: Calling all_inventory to load vars for managed_node1 15494 1726853364.16599: Calling groups_inventory to load vars for managed_node1 15494 1726853364.16601: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853364.16618: Calling all_plugins_play to load vars for managed_node1 15494 1726853364.16621: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853364.16623: Calling groups_plugins_play to load vars for managed_node1 15494 1726853364.18535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853364.20130: done with get_vars() 15494 1726853364.20163: done getting variables 15494 1726853364.20226: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:29:24 -0400 (0:00:00.195) 0:00:32.818 ****** 15494 1726853364.20293: entering _queue_task() for managed_node1/package 15494 1726853364.20682: worker is 1 (out of 1 available) 15494 1726853364.20695: exiting _queue_task() for managed_node1/package 15494 1726853364.20713: done queuing things up, now waiting for results queue to drain 15494 1726853364.20719: waiting for pending results... 
15494 1726853364.21026: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15494 1726853364.21098: in run() - task 02083763-bbaf-0028-1a50-000000000063 15494 1726853364.21126: variable 'ansible_search_path' from source: unknown 15494 1726853364.21135: variable 'ansible_search_path' from source: unknown 15494 1726853364.21182: calling self._execute() 15494 1726853364.21290: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.21293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.21304: variable 'omit' from source: magic vars 15494 1726853364.21602: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.21611: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853364.21698: variable 'network_state' from source: role '' defaults 15494 1726853364.21707: Evaluated conditional (network_state != {}): False 15494 1726853364.21710: when evaluation is False, skipping this task 15494 1726853364.21713: _execute() done 15494 1726853364.21715: dumping result to json 15494 1726853364.21718: done dumping result, returning 15494 1726853364.21724: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-0028-1a50-000000000063] 15494 1726853364.21729: sending task result for task 02083763-bbaf-0028-1a50-000000000063 15494 1726853364.21820: done sending task result for task 02083763-bbaf-0028-1a50-000000000063 15494 1726853364.21822: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853364.21884: no more pending results, returning what we have 15494 1726853364.21888: results queue empty 15494 1726853364.21889: checking 
for any_errors_fatal 15494 1726853364.21898: done checking for any_errors_fatal 15494 1726853364.21899: checking for max_fail_percentage 15494 1726853364.21901: done checking for max_fail_percentage 15494 1726853364.21902: checking to see if all hosts have failed and the running result is not ok 15494 1726853364.21902: done checking to see if all hosts have failed 15494 1726853364.21903: getting the remaining hosts for this loop 15494 1726853364.21905: done getting the remaining hosts for this loop 15494 1726853364.21908: getting the next task for host managed_node1 15494 1726853364.21915: done getting next task for host managed_node1 15494 1726853364.21919: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15494 1726853364.21920: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853364.21943: getting variables 15494 1726853364.21945: in VariableManager get_vars() 15494 1726853364.21980: Calling all_inventory to load vars for managed_node1 15494 1726853364.21982: Calling groups_inventory to load vars for managed_node1 15494 1726853364.21984: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853364.21993: Calling all_plugins_play to load vars for managed_node1 15494 1726853364.21995: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853364.21997: Calling groups_plugins_play to load vars for managed_node1 15494 1726853364.22773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853364.23697: done with get_vars() 15494 1726853364.23718: done getting variables 15494 1726853364.23775: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:29:24 -0400 (0:00:00.035) 0:00:32.853 ****** 15494 1726853364.23809: entering _queue_task() for managed_node1/package 15494 1726853364.24090: worker is 1 (out of 1 available) 15494 1726853364.24100: exiting _queue_task() for managed_node1/package 15494 1726853364.24112: done queuing things up, now waiting for results queue to drain 15494 1726853364.24113: waiting for pending results... 
15494 1726853364.24417: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15494 1726853364.24476: in run() - task 02083763-bbaf-0028-1a50-000000000064 15494 1726853364.24505: variable 'ansible_search_path' from source: unknown 15494 1726853364.24511: variable 'ansible_search_path' from source: unknown 15494 1726853364.24544: calling self._execute() 15494 1726853364.24632: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.24636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.24663: variable 'omit' from source: magic vars 15494 1726853364.25076: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.25080: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853364.25136: variable 'network_state' from source: role '' defaults 15494 1726853364.25152: Evaluated conditional (network_state != {}): False 15494 1726853364.25160: when evaluation is False, skipping this task 15494 1726853364.25166: _execute() done 15494 1726853364.25175: dumping result to json 15494 1726853364.25183: done dumping result, returning 15494 1726853364.25194: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-0028-1a50-000000000064] 15494 1726853364.25201: sending task result for task 02083763-bbaf-0028-1a50-000000000064 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853364.25343: no more pending results, returning what we have 15494 1726853364.25348: results queue empty 15494 1726853364.25349: checking for any_errors_fatal 15494 1726853364.25358: done checking for any_errors_fatal 15494 1726853364.25359: checking for max_fail_percentage 15494 
1726853364.25361: done checking for max_fail_percentage 15494 1726853364.25362: checking to see if all hosts have failed and the running result is not ok 15494 1726853364.25363: done checking to see if all hosts have failed 15494 1726853364.25364: getting the remaining hosts for this loop 15494 1726853364.25365: done getting the remaining hosts for this loop 15494 1726853364.25369: getting the next task for host managed_node1 15494 1726853364.25379: done getting next task for host managed_node1 15494 1726853364.25383: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15494 1726853364.25385: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853364.25400: getting variables 15494 1726853364.25402: in VariableManager get_vars() 15494 1726853364.25440: Calling all_inventory to load vars for managed_node1 15494 1726853364.25443: Calling groups_inventory to load vars for managed_node1 15494 1726853364.25445: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853364.25458: Calling all_plugins_play to load vars for managed_node1 15494 1726853364.25461: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853364.25465: Calling groups_plugins_play to load vars for managed_node1 15494 1726853364.25773: done sending task result for task 02083763-bbaf-0028-1a50-000000000064 15494 1726853364.25777: WORKER PROCESS EXITING 15494 1726853364.26687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853364.27580: done with get_vars() 15494 1726853364.27596: done getting variables 15494 1726853364.27641: Loading ActionModule 'service' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:29:24 -0400 (0:00:00.038) 0:00:32.892 ****** 15494 1726853364.27667: entering _queue_task() for managed_node1/service 15494 1726853364.27901: worker is 1 (out of 1 available) 15494 1726853364.27913: exiting _queue_task() for managed_node1/service 15494 1726853364.27925: done queuing things up, now waiting for results queue to drain 15494 1726853364.27926: waiting for pending results... 15494 1726853364.28112: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15494 1726853364.28188: in run() - task 02083763-bbaf-0028-1a50-000000000065 15494 1726853364.28199: variable 'ansible_search_path' from source: unknown 15494 1726853364.28203: variable 'ansible_search_path' from source: unknown 15494 1726853364.28232: calling self._execute() 15494 1726853364.28312: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.28316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.28325: variable 'omit' from source: magic vars 15494 1726853364.28876: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.28880: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853364.28883: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853364.29055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15494 1726853364.30685: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853364.30729: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853364.30760: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853364.30787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853364.30806: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853364.30883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.30913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.30930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.30958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.30972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.31005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15494 1726853364.31021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.31038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.31064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.31079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.31106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.31121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.31138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.31163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.31176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.31289: variable 'network_connections' from source: play vars 15494 1726853364.31300: variable 'profile' from source: play vars 15494 1726853364.31344: variable 'profile' from source: play vars 15494 1726853364.31350: variable 'interface' from source: set_fact 15494 1726853364.31391: variable 'interface' from source: set_fact 15494 1726853364.31442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853364.31551: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853364.31577: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853364.31601: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853364.31626: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853364.31653: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853364.31668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853364.31687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.31704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853364.31743: variable 
'__network_team_connections_defined' from source: role '' defaults 15494 1726853364.31895: variable 'network_connections' from source: play vars 15494 1726853364.31899: variable 'profile' from source: play vars 15494 1726853364.31943: variable 'profile' from source: play vars 15494 1726853364.31949: variable 'interface' from source: set_fact 15494 1726853364.31991: variable 'interface' from source: set_fact 15494 1726853364.32010: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15494 1726853364.32013: when evaluation is False, skipping this task 15494 1726853364.32015: _execute() done 15494 1726853364.32018: dumping result to json 15494 1726853364.32020: done dumping result, returning 15494 1726853364.32027: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-0028-1a50-000000000065] 15494 1726853364.32039: sending task result for task 02083763-bbaf-0028-1a50-000000000065 15494 1726853364.32115: done sending task result for task 02083763-bbaf-0028-1a50-000000000065 15494 1726853364.32117: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15494 1726853364.32194: no more pending results, returning what we have 15494 1726853364.32198: results queue empty 15494 1726853364.32199: checking for any_errors_fatal 15494 1726853364.32204: done checking for any_errors_fatal 15494 1726853364.32205: checking for max_fail_percentage 15494 1726853364.32207: done checking for max_fail_percentage 15494 1726853364.32207: checking to see if all hosts have failed and the running result is not ok 15494 1726853364.32208: done checking to see if all hosts have failed 15494 1726853364.32209: getting the remaining hosts for this loop 15494 1726853364.32210: 
done getting the remaining hosts for this loop 15494 1726853364.32214: getting the next task for host managed_node1 15494 1726853364.32220: done getting next task for host managed_node1 15494 1726853364.32223: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15494 1726853364.32227: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853364.32239: getting variables 15494 1726853364.32241: in VariableManager get_vars() 15494 1726853364.32278: Calling all_inventory to load vars for managed_node1 15494 1726853364.32280: Calling groups_inventory to load vars for managed_node1 15494 1726853364.32282: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853364.32290: Calling all_plugins_play to load vars for managed_node1 15494 1726853364.32292: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853364.32295: Calling groups_plugins_play to load vars for managed_node1 15494 1726853364.33415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853364.35356: done with get_vars() 15494 1726853364.35390: done getting variables 15494 1726853364.35438: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:29:24 -0400 
(0:00:00.077) 0:00:32.970 ****** 15494 1726853364.35476: entering _queue_task() for managed_node1/service 15494 1726853364.35922: worker is 1 (out of 1 available) 15494 1726853364.35937: exiting _queue_task() for managed_node1/service 15494 1726853364.35950: done queuing things up, now waiting for results queue to drain 15494 1726853364.35951: waiting for pending results... 15494 1726853364.36215: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15494 1726853364.36379: in run() - task 02083763-bbaf-0028-1a50-000000000066 15494 1726853364.36384: variable 'ansible_search_path' from source: unknown 15494 1726853364.36387: variable 'ansible_search_path' from source: unknown 15494 1726853364.36391: calling self._execute() 15494 1726853364.36514: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.36518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.36535: variable 'omit' from source: magic vars 15494 1726853364.36866: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.36888: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853364.37009: variable 'network_provider' from source: set_fact 15494 1726853364.37013: variable 'network_state' from source: role '' defaults 15494 1726853364.37021: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15494 1726853364.37027: variable 'omit' from source: magic vars 15494 1726853364.37059: variable 'omit' from source: magic vars 15494 1726853364.37213: variable 'network_service_name' from source: role '' defaults 15494 1726853364.37216: variable 'network_service_name' from source: role '' defaults 15494 1726853364.37342: variable '__network_provider_setup' from source: role '' defaults 15494 1726853364.37345: variable '__network_service_name_default_nm' from source: role '' defaults 15494 
1726853364.37396: variable '__network_service_name_default_nm' from source: role '' defaults 15494 1726853364.37435: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853364.37480: variable '__network_packages_default_nm' from source: role '' defaults 15494 1726853364.37876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853364.40214: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853364.40289: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853364.40331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853364.40386: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853364.40422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853364.40513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.40551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.40588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.40638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 
1726853364.40661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.40741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.40775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.40807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.40857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.40878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.41110: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15494 1726853364.41234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.41264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.41300: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.41343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.41366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.41442: variable 'ansible_python' from source: facts 15494 1726853364.41462: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15494 1726853364.41527: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853364.41577: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853364.41660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.41679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.41697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.41721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.41731: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.41770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853364.41796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853364.41814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.41839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853364.41852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853364.41946: variable 'network_connections' from source: play vars 15494 1726853364.41952: variable 'profile' from source: play vars 15494 1726853364.42011: variable 'profile' from source: play vars 15494 1726853364.42030: variable 'interface' from source: set_fact 15494 1726853364.42140: variable 'interface' from source: set_fact 15494 1726853364.42225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853364.42459: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853364.42462: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853364.42494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853364.42535: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853364.42676: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853364.42679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853364.42681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853364.42697: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853364.42793: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853364.43043: variable 'network_connections' from source: play vars 15494 1726853364.43047: variable 'profile' from source: play vars 15494 1726853364.43098: variable 'profile' from source: play vars 15494 1726853364.43101: variable 'interface' from source: set_fact 15494 1726853364.43153: variable 'interface' from source: set_fact 15494 1726853364.43208: variable '__network_packages_default_wireless' from source: role '' defaults 15494 1726853364.43243: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853364.43509: variable 'network_connections' from source: play vars 15494 1726853364.43519: variable 'profile' from source: play vars 15494 
1726853364.43579: variable 'profile' from source: play vars 15494 1726853364.43583: variable 'interface' from source: set_fact 15494 1726853364.43633: variable 'interface' from source: set_fact 15494 1726853364.43678: variable '__network_packages_default_team' from source: role '' defaults 15494 1726853364.43727: variable '__network_team_connections_defined' from source: role '' defaults 15494 1726853364.43978: variable 'network_connections' from source: play vars 15494 1726853364.43981: variable 'profile' from source: play vars 15494 1726853364.44030: variable 'profile' from source: play vars 15494 1726853364.44035: variable 'interface' from source: set_fact 15494 1726853364.44099: variable 'interface' from source: set_fact 15494 1726853364.44151: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853364.44214: variable '__network_service_name_default_initscripts' from source: role '' defaults 15494 1726853364.44218: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853364.44296: variable '__network_packages_default_initscripts' from source: role '' defaults 15494 1726853364.44426: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15494 1726853364.44982: variable 'network_connections' from source: play vars 15494 1726853364.44986: variable 'profile' from source: play vars 15494 1726853364.44988: variable 'profile' from source: play vars 15494 1726853364.44990: variable 'interface' from source: set_fact 15494 1726853364.45047: variable 'interface' from source: set_fact 15494 1726853364.45062: variable 'ansible_distribution' from source: facts 15494 1726853364.45075: variable '__network_rh_distros' from source: role '' defaults 15494 1726853364.45109: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.45150: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15494 
1726853364.45356: variable 'ansible_distribution' from source: facts 15494 1726853364.45367: variable '__network_rh_distros' from source: role '' defaults 15494 1726853364.45381: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.45398: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15494 1726853364.45575: variable 'ansible_distribution' from source: facts 15494 1726853364.45584: variable '__network_rh_distros' from source: role '' defaults 15494 1726853364.45593: variable 'ansible_distribution_major_version' from source: facts 15494 1726853364.45636: variable 'network_provider' from source: set_fact 15494 1726853364.45876: variable 'omit' from source: magic vars 15494 1726853364.45880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853364.45883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853364.45885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853364.45886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853364.45889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853364.45890: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853364.45892: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.45894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.45904: Set connection var ansible_connection to ssh 15494 1726853364.45914: Set connection var ansible_pipelining to False 15494 1726853364.45923: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853364.45929: Set connection var 
ansible_shell_type to sh 15494 1726853364.45938: Set connection var ansible_timeout to 10 15494 1726853364.45951: Set connection var ansible_shell_executable to /bin/sh 15494 1726853364.45987: variable 'ansible_shell_executable' from source: unknown 15494 1726853364.45999: variable 'ansible_connection' from source: unknown 15494 1726853364.46011: variable 'ansible_module_compression' from source: unknown 15494 1726853364.46014: variable 'ansible_shell_type' from source: unknown 15494 1726853364.46017: variable 'ansible_shell_executable' from source: unknown 15494 1726853364.46019: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853364.46025: variable 'ansible_pipelining' from source: unknown 15494 1726853364.46027: variable 'ansible_timeout' from source: unknown 15494 1726853364.46029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853364.46112: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853364.46127: variable 'omit' from source: magic vars 15494 1726853364.46132: starting attempt loop 15494 1726853364.46135: running the handler 15494 1726853364.46196: variable 'ansible_facts' from source: unknown 15494 1726853364.46705: _low_level_execute_command(): starting 15494 1726853364.46708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853364.47226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853364.47230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853364.47232: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853364.47235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.47282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853364.47296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853364.47352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853364.49010: stdout chunk (state=3): >>>/root <<< 15494 1726853364.49114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853364.49160: stderr chunk (state=3): >>><<< 15494 1726853364.49163: stdout chunk (state=3): >>><<< 15494 1726853364.49198: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853364.49201: _low_level_execute_command(): starting 15494 1726853364.49267: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404 `" && echo ansible-tmp-1726853364.491904-16975-265142481383404="` echo /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404 `" ) && sleep 0' 15494 1726853364.49889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.49976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853364.49995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853364.50056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853364.50104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853364.52022: stdout chunk (state=3): >>>ansible-tmp-1726853364.491904-16975-265142481383404=/root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404 <<< 15494 1726853364.52202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853364.52205: stdout chunk (state=3): >>><<< 15494 1726853364.52208: stderr chunk (state=3): >>><<< 15494 1726853364.52377: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853364.491904-16975-265142481383404=/root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853364.52380: variable 'ansible_module_compression' from source: unknown 15494 1726853364.52383: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15494 1726853364.52442: variable 'ansible_facts' from source: unknown 15494 1726853364.52695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py 15494 1726853364.53228: Sending initial data 15494 1726853364.53288: Sent initial data (155 bytes) 15494 1726853364.54185: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.54218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853364.54230: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853364.54250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853364.54311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853364.56031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853364.56036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853364.56054: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp8ngmpzop /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py <<< 15494 1726853364.56091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py" <<< 15494 1726853364.56095: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp8ngmpzop" to remote "/root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py" <<< 15494 1726853364.57670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853364.57676: stdout chunk (state=3): >>><<< 15494 1726853364.57679: stderr chunk (state=3): >>><<< 15494 1726853364.57681: done transferring module to remote 15494 1726853364.57683: _low_level_execute_command(): starting 15494 1726853364.57686: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/ /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py && sleep 0' 15494 1726853364.58401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.58421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853364.58452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853364.58602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853364.60522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853364.60526: stdout chunk (state=3): >>><<< 15494 1726853364.60528: stderr chunk (state=3): >>><<< 15494 1726853364.60531: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853364.60533: _low_level_execute_command(): starting 15494 1726853364.60536: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/AnsiballZ_systemd.py && sleep 0' 15494 1726853364.61702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853364.61705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853364.61708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.61710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853364.61712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853364.61714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.61878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853364.61903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 15494 1726853364.62037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853364.91057: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10604544", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313655808", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "769476000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 15494 1726853364.91099: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service 
cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", "InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 
13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15494 1726853364.93220: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853364.93224: stdout chunk (state=3): >>><<< 15494 1726853364.93226: stderr chunk (state=3): >>><<< 15494 1726853364.93230: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "702", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"ExecMainStartTimestampMonotonic": "13747067", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ExecMainHandoffTimestampMonotonic": "13825256", "ExecMainPID": "702", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2977", "MemoryCurrent": "10604544", "MemoryPeak": "14561280", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3313655808", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "769476000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target multi-user.target network.target cloud-init.service", "After": "cloud-init-local.service systemd-journald.socket sysinit.target dbus.socket dbus-broker.service basic.target system.slice network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:27:21 EDT", "StateChangeTimestampMonotonic": "407641563", "InactiveExitTimestamp": "Fri 2024-09-20 13:20:47 EDT", 
"InactiveExitTimestampMonotonic": "13748890", "ActiveEnterTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ActiveEnterTimestampMonotonic": "14166608", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:20:47 EDT", "ConditionTimestampMonotonic": "13745559", "AssertTimestamp": "Fri 2024-09-20 13:20:47 EDT", "AssertTimestampMonotonic": "13745562", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "5f58decfa480494eac8aa3993b4c7ec8", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853364.93625: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853364.93655: _low_level_execute_command(): starting 15494 1726853364.93661: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853364.491904-16975-265142481383404/ > /dev/null 2>&1 && sleep 0' 15494 1726853364.94318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853364.94321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853364.94324: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 15494 1726853364.94326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853364.94328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853364.94331: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853364.94333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.94335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853364.94337: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853364.94419: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853364.94422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853364.94425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853364.94427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853364.94429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853364.94431: stderr chunk (state=3): >>>debug2: match found <<< 15494 1726853364.94432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853364.94529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853364.94533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853364.94535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853364.94685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853364.96445: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853364.96489: stderr chunk (state=3): >>><<< 15494 1726853364.96493: stdout chunk (state=3): >>><<< 15494 1726853364.96513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853364.96516: handler run complete 15494 1726853364.96578: attempt loop complete, returning result 15494 1726853364.96581: _execute() done 15494 1726853364.96584: dumping result to json 15494 1726853364.96601: done dumping result, returning 15494 1726853364.96621: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-0028-1a50-000000000066] 15494 1726853364.96625: sending task result for task 02083763-bbaf-0028-1a50-000000000066 ok: [managed_node1] => { "censored": "the output has 
been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853364.97293: no more pending results, returning what we have 15494 1726853364.97297: results queue empty 15494 1726853364.97298: checking for any_errors_fatal 15494 1726853364.97305: done checking for any_errors_fatal 15494 1726853364.97306: checking for max_fail_percentage 15494 1726853364.97308: done checking for max_fail_percentage 15494 1726853364.97308: checking to see if all hosts have failed and the running result is not ok 15494 1726853364.97309: done checking to see if all hosts have failed 15494 1726853364.97310: getting the remaining hosts for this loop 15494 1726853364.97312: done getting the remaining hosts for this loop 15494 1726853364.97316: getting the next task for host managed_node1 15494 1726853364.97323: done getting next task for host managed_node1 15494 1726853364.97328: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15494 1726853364.97330: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853364.97340: getting variables 15494 1726853364.97342: in VariableManager get_vars() 15494 1726853364.97451: Calling all_inventory to load vars for managed_node1 15494 1726853364.97454: Calling groups_inventory to load vars for managed_node1 15494 1726853364.97457: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853364.97468: Calling all_plugins_play to load vars for managed_node1 15494 1726853364.97598: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853364.97606: done sending task result for task 02083763-bbaf-0028-1a50-000000000066 15494 1726853364.97609: WORKER PROCESS EXITING 15494 1726853364.97614: Calling groups_plugins_play to load vars for managed_node1 15494 1726853364.99092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853365.01410: done with get_vars() 15494 1726853365.01440: done getting variables 15494 1726853365.01499: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:29:25 -0400 (0:00:00.660) 0:00:33.631 ****** 15494 1726853365.01531: entering _queue_task() for managed_node1/service 15494 1726853365.02173: worker is 1 (out of 1 available) 15494 1726853365.02187: exiting _queue_task() for managed_node1/service 15494 1726853365.02198: done queuing things up, now waiting for results queue to drain 15494 1726853365.02199: waiting for pending results... 
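The large JSON payload echoed above is the systemd module's serialization of the unit's properties, essentially the `Key=Value` listing that `systemctl show NetworkManager` prints, returned as the task result's "status" field. As an illustration only (this sketch is not part of the playbook run logged here, and the sample values are a hand-picked subset of the properties shown above), turning such a property listing into a dict of the same shape looks like:

```python
# Illustrative sketch, not code from the run above: parse the `Key=Value`
# lines emitted by `systemctl show <unit>` into a dict -- the same shape
# as the "status" field the systemd module returned for NetworkManager.
SAMPLE = """\
Type=dbus
MainPID=702
ActiveState=active
SubState=running
UnitFileState=enabled
"""

def parse_show_output(text: str) -> dict:
    """Split each non-empty line on the first '=' only, since values
    (e.g. ExecStart=...) may themselves contain '=' characters."""
    props = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, value = line.partition("=")
            props[key] = value
    return props

status = parse_show_output(SAMPLE)
print(status["ActiveState"])  # -> active
```

Splitting only on the first `=` matters because property values such as `ExecStart={ path=/usr/sbin/NetworkManager ; ... }` contain further `=` signs.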
15494 1726853365.02474: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15494 1726853365.02609: in run() - task 02083763-bbaf-0028-1a50-000000000067 15494 1726853365.02632: variable 'ansible_search_path' from source: unknown 15494 1726853365.02641: variable 'ansible_search_path' from source: unknown 15494 1726853365.02744: calling self._execute() 15494 1726853365.02882: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.02893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.02909: variable 'omit' from source: magic vars 15494 1726853365.03313: variable 'ansible_distribution_major_version' from source: facts 15494 1726853365.03330: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853365.03470: variable 'network_provider' from source: set_fact 15494 1726853365.03474: Evaluated conditional (network_provider == "nm"): True 15494 1726853365.03556: variable '__network_wpa_supplicant_required' from source: role '' defaults 15494 1726853365.03648: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15494 1726853365.03877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853365.06298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853365.06370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 1726853365.06413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853365.06453: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853365.06487: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853365.06642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853365.06646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853365.06649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853365.06876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853365.06978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853365.06982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853365.06984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853365.07029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853365.07078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853365.07104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853365.07151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853365.07184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853365.07216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853365.07257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853365.07278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853365.07421: variable 'network_connections' from source: play vars 15494 1726853365.07477: variable 'profile' from source: play vars 15494 1726853365.07517: variable 'profile' from source: play vars 15494 1726853365.07527: variable 'interface' from source: set_fact 15494 1726853365.07634: variable 'interface' from source: set_fact 15494 1726853365.07759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15494 1726853365.07873: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15494 1726853365.07900: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15494 1726853365.07928: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15494 1726853365.07949: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15494 1726853365.07984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15494 1726853365.07998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15494 1726853365.08029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853365.08047: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15494 1726853365.08087: variable '__network_wireless_connections_defined' from source: role '' defaults 15494 1726853365.08242: variable 'network_connections' from source: play vars 15494 1726853365.08247: variable 'profile' from source: play vars 15494 1726853365.08294: variable 'profile' from source: play vars 15494 1726853365.08297: variable 'interface' from source: set_fact 15494 1726853365.08342: variable 'interface' from source: set_fact 15494 1726853365.08367: Evaluated conditional (__network_wpa_supplicant_required): False 15494 1726853365.08372: when evaluation is False, skipping this task 15494 1726853365.08375: _execute() done 15494 1726853365.08386: dumping result 
to json 15494 1726853365.08389: done dumping result, returning 15494 1726853365.08392: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-0028-1a50-000000000067] 15494 1726853365.08394: sending task result for task 02083763-bbaf-0028-1a50-000000000067 15494 1726853365.08476: done sending task result for task 02083763-bbaf-0028-1a50-000000000067 15494 1726853365.08479: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15494 1726853365.08527: no more pending results, returning what we have 15494 1726853365.08530: results queue empty 15494 1726853365.08531: checking for any_errors_fatal 15494 1726853365.08546: done checking for any_errors_fatal 15494 1726853365.08546: checking for max_fail_percentage 15494 1726853365.08548: done checking for max_fail_percentage 15494 1726853365.08549: checking to see if all hosts have failed and the running result is not ok 15494 1726853365.08550: done checking to see if all hosts have failed 15494 1726853365.08550: getting the remaining hosts for this loop 15494 1726853365.08552: done getting the remaining hosts for this loop 15494 1726853365.08555: getting the next task for host managed_node1 15494 1726853365.08562: done getting next task for host managed_node1 15494 1726853365.08565: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15494 1726853365.08567: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853365.08582: getting variables 15494 1726853365.08584: in VariableManager get_vars() 15494 1726853365.08629: Calling all_inventory to load vars for managed_node1 15494 1726853365.08632: Calling groups_inventory to load vars for managed_node1 15494 1726853365.08634: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853365.08643: Calling all_plugins_play to load vars for managed_node1 15494 1726853365.08646: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853365.08648: Calling groups_plugins_play to load vars for managed_node1 15494 1726853365.09607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853365.11319: done with get_vars() 15494 1726853365.11349: done getting variables 15494 1726853365.11409: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:29:25 -0400 (0:00:00.099) 0:00:33.730 ****** 15494 1726853365.11444: entering _queue_task() for managed_node1/service 15494 1726853365.11786: worker is 1 (out of 1 available) 15494 1726853365.11798: exiting _queue_task() for managed_node1/service 15494 1726853365.11811: done queuing things up, now waiting for results queue to drain 15494 1726853365.11814: waiting for pending results... 
15494 1726853365.12238: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 15494 1726853365.12252: in run() - task 02083763-bbaf-0028-1a50-000000000068 15494 1726853365.12255: variable 'ansible_search_path' from source: unknown 15494 1726853365.12258: variable 'ansible_search_path' from source: unknown 15494 1726853365.12365: calling self._execute() 15494 1726853365.12368: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.12380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.12475: variable 'omit' from source: magic vars 15494 1726853365.12803: variable 'ansible_distribution_major_version' from source: facts 15494 1726853365.12820: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853365.12936: variable 'network_provider' from source: set_fact 15494 1726853365.12944: Evaluated conditional (network_provider == "initscripts"): False 15494 1726853365.12949: when evaluation is False, skipping this task 15494 1726853365.12952: _execute() done 15494 1726853365.12954: dumping result to json 15494 1726853365.12961: done dumping result, returning 15494 1726853365.12965: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-0028-1a50-000000000068] 15494 1726853365.12967: sending task result for task 02083763-bbaf-0028-1a50-000000000068 15494 1726853365.13152: done sending task result for task 02083763-bbaf-0028-1a50-000000000068 15494 1726853365.13155: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15494 1726853365.13217: no more pending results, returning what we have 15494 1726853365.13220: results queue empty 15494 1726853365.13221: checking for any_errors_fatal 15494 1726853365.13226: done checking for 
any_errors_fatal 15494 1726853365.13226: checking for max_fail_percentage 15494 1726853365.13228: done checking for max_fail_percentage 15494 1726853365.13229: checking to see if all hosts have failed and the running result is not ok 15494 1726853365.13230: done checking to see if all hosts have failed 15494 1726853365.13230: getting the remaining hosts for this loop 15494 1726853365.13231: done getting the remaining hosts for this loop 15494 1726853365.13234: getting the next task for host managed_node1 15494 1726853365.13239: done getting next task for host managed_node1 15494 1726853365.13242: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15494 1726853365.13244: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853365.13263: getting variables 15494 1726853365.13265: in VariableManager get_vars() 15494 1726853365.13299: Calling all_inventory to load vars for managed_node1 15494 1726853365.13302: Calling groups_inventory to load vars for managed_node1 15494 1726853365.13304: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853365.13313: Calling all_plugins_play to load vars for managed_node1 15494 1726853365.13317: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853365.13320: Calling groups_plugins_play to load vars for managed_node1 15494 1726853365.14806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853365.16562: done with get_vars() 15494 1726853365.16589: done getting variables 15494 1726853365.16651: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:29:25 -0400 (0:00:00.052) 0:00:33.782 ****** 15494 1726853365.16686: entering _queue_task() for managed_node1/copy 15494 1726853365.17015: worker is 1 (out of 1 available) 15494 1726853365.17051: exiting _queue_task() for managed_node1/copy 15494 1726853365.17065: done queuing things up, now waiting for results queue to drain 15494 1726853365.17066: waiting for pending results... 
15494 1726853365.17769: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15494 1726853365.17777: in run() - task 02083763-bbaf-0028-1a50-000000000069 15494 1726853365.17781: variable 'ansible_search_path' from source: unknown 15494 1726853365.17784: variable 'ansible_search_path' from source: unknown 15494 1726853365.17786: calling self._execute() 15494 1726853365.17899: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.17911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.17929: variable 'omit' from source: magic vars 15494 1726853365.18497: variable 'ansible_distribution_major_version' from source: facts 15494 1726853365.18541: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853365.18755: variable 'network_provider' from source: set_fact 15494 1726853365.18770: Evaluated conditional (network_provider == "initscripts"): False 15494 1726853365.18780: when evaluation is False, skipping this task 15494 1726853365.18828: _execute() done 15494 1726853365.18832: dumping result to json 15494 1726853365.18834: done dumping result, returning 15494 1726853365.18840: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-0028-1a50-000000000069] 15494 1726853365.18877: sending task result for task 02083763-bbaf-0028-1a50-000000000069 skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15494 1726853365.19143: no more pending results, returning what we have 15494 1726853365.19147: results queue empty 15494 1726853365.19148: checking for any_errors_fatal 15494 1726853365.19158: done checking for any_errors_fatal 15494 1726853365.19158: checking for max_fail_percentage 15494 
1726853365.19161: done checking for max_fail_percentage 15494 1726853365.19162: checking to see if all hosts have failed and the running result is not ok 15494 1726853365.19163: done checking to see if all hosts have failed 15494 1726853365.19163: getting the remaining hosts for this loop 15494 1726853365.19165: done getting the remaining hosts for this loop 15494 1726853365.19169: getting the next task for host managed_node1 15494 1726853365.19182: done getting next task for host managed_node1 15494 1726853365.19187: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15494 1726853365.19190: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853365.19207: done sending task result for task 02083763-bbaf-0028-1a50-000000000069 15494 1726853365.19210: WORKER PROCESS EXITING 15494 1726853365.19220: getting variables 15494 1726853365.19222: in VariableManager get_vars() 15494 1726853365.19267: Calling all_inventory to load vars for managed_node1 15494 1726853365.19398: Calling groups_inventory to load vars for managed_node1 15494 1726853365.19402: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853365.19419: Calling all_plugins_play to load vars for managed_node1 15494 1726853365.19425: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853365.19429: Calling groups_plugins_play to load vars for managed_node1 15494 1726853365.26723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853365.28441: done with get_vars() 15494 1726853365.28466: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:29:25 -0400 (0:00:00.118) 0:00:33.901 ****** 15494 1726853365.28557: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15494 1726853365.29018: worker is 1 (out of 1 available) 15494 1726853365.29031: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 15494 1726853365.29044: done queuing things up, now waiting for results queue to drain 15494 1726853365.29046: waiting for pending results... 15494 1726853365.29441: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15494 1726853365.29642: in run() - task 02083763-bbaf-0028-1a50-00000000006a 15494 1726853365.29667: variable 'ansible_search_path' from source: unknown 15494 1726853365.29717: variable 'ansible_search_path' from source: unknown 15494 1726853365.29734: calling self._execute() 15494 1726853365.29864: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.29882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.29902: variable 'omit' from source: magic vars 15494 1726853365.30376: variable 'ansible_distribution_major_version' from source: facts 15494 1726853365.30380: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853365.30388: variable 'omit' from source: magic vars 15494 1726853365.30426: variable 'omit' from source: magic vars 15494 1726853365.30621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15494 1726853365.32958: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15494 1726853365.33053: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15494 
1726853365.33110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15494 1726853365.33152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15494 1726853365.33189: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15494 1726853365.33293: variable 'network_provider' from source: set_fact 15494 1726853365.33539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15494 1726853365.33542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15494 1726853365.33545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15494 1726853365.33563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15494 1726853365.33586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15494 1726853365.33670: variable 'omit' from source: magic vars 15494 1726853365.33783: variable 'omit' from source: magic vars 15494 1726853365.33896: variable 'network_connections' from source: play vars 15494 1726853365.33913: variable 'profile' from source: play vars 15494 1726853365.33993: variable 'profile' from source: play vars 15494 1726853365.34007: variable 'interface' from source: set_fact 
15494 1726853365.34067: variable 'interface' from source: set_fact 15494 1726853365.34378: variable 'omit' from source: magic vars 15494 1726853365.34381: variable '__lsr_ansible_managed' from source: task vars 15494 1726853365.34384: variable '__lsr_ansible_managed' from source: task vars 15494 1726853365.34510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15494 1726853365.34754: Loaded config def from plugin (lookup/template) 15494 1726853365.34763: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15494 1726853365.34794: File lookup term: get_ansible_managed.j2 15494 1726853365.34802: variable 'ansible_search_path' from source: unknown 15494 1726853365.34825: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15494 1726853365.34848: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 
15494 1726853365.34870: variable 'ansible_search_path' from source: unknown 15494 1726853365.41459: variable 'ansible_managed' from source: unknown 15494 1726853365.41553: variable 'omit' from source: magic vars 15494 1726853365.41576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853365.41596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853365.41612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853365.41626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853365.41635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853365.41662: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853365.41664: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.41667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.41731: Set connection var ansible_connection to ssh 15494 1726853365.41737: Set connection var ansible_pipelining to False 15494 1726853365.41742: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853365.41746: Set connection var ansible_shell_type to sh 15494 1726853365.41748: Set connection var ansible_timeout to 10 15494 1726853365.41759: Set connection var ansible_shell_executable to /bin/sh 15494 1726853365.41778: variable 'ansible_shell_executable' from source: unknown 15494 1726853365.41781: variable 'ansible_connection' from source: unknown 15494 1726853365.41783: variable 'ansible_module_compression' from source: unknown 15494 1726853365.41786: variable 'ansible_shell_type' from source: unknown 15494 1726853365.41788: variable 'ansible_shell_executable' 
from source: unknown 15494 1726853365.41790: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.41795: variable 'ansible_pipelining' from source: unknown 15494 1726853365.41797: variable 'ansible_timeout' from source: unknown 15494 1726853365.41801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.41897: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853365.41908: variable 'omit' from source: magic vars 15494 1726853365.41911: starting attempt loop 15494 1726853365.41914: running the handler 15494 1726853365.41927: _low_level_execute_command(): starting 15494 1726853365.41932: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853365.42423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853365.42427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.42429: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853365.42432: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.42479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853365.42483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853365.42487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853365.42533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853365.44223: stdout chunk (state=3): >>>/root <<< 15494 1726853365.44319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853365.44347: stderr chunk (state=3): >>><<< 15494 1726853365.44375: stdout chunk (state=3): >>><<< 15494 1726853365.44426: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15494 1726853365.44429: _low_level_execute_command(): starting 15494 1726853365.44433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089 `" && echo ansible-tmp-1726853365.4438562-17025-247094428046089="` echo /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089 `" ) && sleep 0' 15494 1726853365.44991: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853365.45007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853365.45077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853365.45095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.45116: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853365.45132: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853365.45212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853365.45296: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853365.47173: stdout chunk (state=3): >>>ansible-tmp-1726853365.4438562-17025-247094428046089=/root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089 <<< 15494 1726853365.47303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853365.47309: stdout chunk (state=3): >>><<< 15494 1726853365.47320: stderr chunk (state=3): >>><<< 15494 1726853365.47369: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853365.4438562-17025-247094428046089=/root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853365.47374: variable 'ansible_module_compression' from source: unknown 15494 1726853365.47406: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15494 1726853365.47430: variable 'ansible_facts' from source: unknown 15494 1726853365.47501: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py 15494 1726853365.47598: Sending initial data 15494 1726853365.47602: Sent initial data (168 bytes) 15494 1726853365.48152: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853365.48167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853365.48200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853365.48270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853365.49792: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15494 1726853365.49799: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853365.49830: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853365.49873: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpqxzfzwx0 /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py <<< 15494 1726853365.49876: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py" <<< 15494 1726853365.49908: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpqxzfzwx0" to remote "/root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py" <<< 15494 1726853365.49915: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py" <<< 15494 1726853365.50593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853365.50631: stderr chunk (state=3): >>><<< 15494 1726853365.50634: stdout chunk (state=3): >>><<< 15494 1726853365.50672: done transferring module to remote 15494 
1726853365.50681: _low_level_execute_command(): starting 15494 1726853365.50684: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/ /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py && sleep 0' 15494 1726853365.51104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853365.51107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853365.51110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.51112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853365.51114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.51165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853365.51175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853365.51209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853365.52988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
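Each remote step above — creating the remote tmp dir, then `chmod u+x` on the transferred AnsiballZ wrapper — goes through `_low_level_execute_command()`, which wraps a shell snippet in `/bin/sh -c '... && sleep 0'` and collects rc/stdout/stderr, producing exactly the stderr-chunk records seen in this log. A minimal local sketch of that pattern (illustrative only: `low_level_execute` is a hypothetical stand-in, and the real call runs over the multiplexed SSH connection rather than locally):

```python
import subprocess
import tempfile

def low_level_execute(cmd: str):
    """Hypothetical stand-in for _low_level_execute_command(): run a shell
    snippet (locally here; Ansible runs it over the SSH connection) and
    return rc/stdout/stderr, mirroring the log records above."""
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd + " && sleep 0"],  # Ansible appends '&& sleep 0' too
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The same lifecycle seen in the log: make a tmp dir and echo its path back
# so the controller learns the remote tmp location.
base = tempfile.mkdtemp()
rc, out, _ = low_level_execute(f'mkdir "{base}/ansible-tmp" && echo "{base}/ansible-tmp"')
print(rc, out.strip())
```

The `echo` of the created path is why the log's mkdir step reports its stdout as `ansible-tmp-...=/root/.ansible/tmp/ansible-tmp-...`: the controller parses that line to discover where to upload the module.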
15494 1726853365.53004: stderr chunk (state=3): >>><<< 15494 1726853365.53008: stdout chunk (state=3): >>><<< 15494 1726853365.53042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853365.53045: _low_level_execute_command(): starting 15494 1726853365.53047: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/AnsiballZ_network_connections.py && sleep 0' 15494 1726853365.53578: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.53582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.53585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853365.53623: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853365.53716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853365.80290: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_i1etlunu/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_i1etlunu/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/64b0877f-187d-4c9d-a4e5-a37e4f2875dc: error=unknown <<< 15494 1726853365.80426: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", 
"_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15494 1726853365.82215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853365.82218: stdout chunk (state=3): >>><<< 15494 1726853365.82221: stderr chunk (state=3): >>><<< 15494 1726853365.82240: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_i1etlunu/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_i1etlunu/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/64b0877f-187d-4c9d-a4e5-a37e4f2875dc: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": 
{"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853365.82287: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853365.82322: _low_level_execute_command(): starting 15494 1726853365.82325: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853365.4438562-17025-247094428046089/ > /dev/null 2>&1 && sleep 0' 15494 1726853365.82964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853365.82979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853365.82994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853365.83009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853365.83036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.83086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853365.83151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853365.83168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853365.83195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853365.83268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853365.85240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853365.85243: stdout chunk (state=3): >>><<< 15494 1726853365.85246: stderr chunk (state=3): >>><<< 15494 1726853365.85263: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853365.85383: handler run complete 15494 1726853365.85386: attempt loop complete, returning result 15494 1726853365.85388: _execute() done 15494 1726853365.85390: dumping result to json 15494 1726853365.85396: done dumping result, returning 15494 1726853365.85398: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-0028-1a50-00000000006a] 15494 1726853365.85404: sending task result for task 02083763-bbaf-0028-1a50-00000000006a 15494 1726853365.85489: done sending task result for task 02083763-bbaf-0028-1a50-00000000006a 15494 1726853365.85492: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15494 1726853365.85596: no more pending results, returning what we have 15494 1726853365.85600: results queue empty 15494 1726853365.85601: checking for any_errors_fatal 15494 1726853365.85608: done checking for any_errors_fatal 15494 1726853365.85609: checking for max_fail_percentage 15494 1726853365.85610: done checking for max_fail_percentage 15494 1726853365.85611: checking to see if all hosts have failed and the running result is not ok 15494 1726853365.85612: done checking to see if all hosts have failed 15494 1726853365.85613: getting the remaining hosts for this loop 15494 1726853365.85614: done getting the remaining 
hosts for this loop 15494 1726853365.85675: getting the next task for host managed_node1 15494 1726853365.85682: done getting next task for host managed_node1 15494 1726853365.85686: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15494 1726853365.85688: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853365.85697: getting variables 15494 1726853365.85699: in VariableManager get_vars() 15494 1726853365.85850: Calling all_inventory to load vars for managed_node1 15494 1726853365.85853: Calling groups_inventory to load vars for managed_node1 15494 1726853365.85855: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853365.85863: Calling all_plugins_play to load vars for managed_node1 15494 1726853365.85866: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853365.85869: Calling groups_plugins_play to load vars for managed_node1 15494 1726853365.87250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853365.89103: done with get_vars() 15494 1726853365.89125: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:29:25 -0400 (0:00:00.606) 0:00:34.507 ****** 15494 1726853365.89222: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15494 1726853365.89582: worker is 1 (out of 1 available) 15494 1726853365.89596: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 15494 1726853365.89609: done queuing things up, now waiting for results 
queue to drain 15494 1726853365.89610: waiting for pending results... 15494 1726853365.89989: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 15494 1726853365.90278: in run() - task 02083763-bbaf-0028-1a50-00000000006b 15494 1726853365.90282: variable 'ansible_search_path' from source: unknown 15494 1726853365.90285: variable 'ansible_search_path' from source: unknown 15494 1726853365.90287: calling self._execute() 15494 1726853365.90290: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.90292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.90295: variable 'omit' from source: magic vars 15494 1726853365.90676: variable 'ansible_distribution_major_version' from source: facts 15494 1726853365.90680: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853365.90731: variable 'network_state' from source: role '' defaults 15494 1726853365.90739: Evaluated conditional (network_state != {}): False 15494 1726853365.90742: when evaluation is False, skipping this task 15494 1726853365.90753: _execute() done 15494 1726853365.90757: dumping result to json 15494 1726853365.90762: done dumping result, returning 15494 1726853365.90769: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-0028-1a50-00000000006b] 15494 1726853365.90774: sending task result for task 02083763-bbaf-0028-1a50-00000000006b 15494 1726853365.90863: done sending task result for task 02083763-bbaf-0028-1a50-00000000006b 15494 1726853365.90866: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15494 1726853365.90917: no more pending results, returning what we have 15494 1726853365.90922: results queue empty 15494 1726853365.90922: checking for 
any_errors_fatal 15494 1726853365.90932: done checking for any_errors_fatal 15494 1726853365.90933: checking for max_fail_percentage 15494 1726853365.90935: done checking for max_fail_percentage 15494 1726853365.90936: checking to see if all hosts have failed and the running result is not ok 15494 1726853365.90937: done checking to see if all hosts have failed 15494 1726853365.90938: getting the remaining hosts for this loop 15494 1726853365.90939: done getting the remaining hosts for this loop 15494 1726853365.90943: getting the next task for host managed_node1 15494 1726853365.90951: done getting next task for host managed_node1 15494 1726853365.90955: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15494 1726853365.90958: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853365.90981: getting variables 15494 1726853365.90983: in VariableManager get_vars() 15494 1726853365.91022: Calling all_inventory to load vars for managed_node1 15494 1726853365.91026: Calling groups_inventory to load vars for managed_node1 15494 1726853365.91028: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853365.91040: Calling all_plugins_play to load vars for managed_node1 15494 1726853365.91043: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853365.91046: Calling groups_plugins_play to load vars for managed_node1 15494 1726853365.92634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853365.94316: done with get_vars() 15494 1726853365.94351: done getting variables 15494 1726853365.94417: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:29:25 -0400 (0:00:00.052) 0:00:34.560 ****** 15494 1726853365.94459: entering _queue_task() for managed_node1/debug 15494 1726853365.94997: worker is 1 (out of 1 available) 15494 1726853365.95008: exiting _queue_task() for managed_node1/debug 15494 1726853365.95018: done queuing things up, now waiting for results queue to drain 15494 1726853365.95019: waiting for pending results... 
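The `Configure networking state` task above is skipped because the role guards it with a conditional on `network_state`, which defaults to `{}`: the executor evaluates each `when:` clause in turn and short-circuits with a skip result as soon as one is False, which is exactly what the `false_condition` field in the skip output records. A toy model of that decision (hypothetical mini-executor; Ansible really templates conditionals through Jinja2, not Python `eval`):

```python
def run_task(task: dict, variables: dict) -> dict:
    """Hypothetical mini-executor illustrating the skip path seen above:
    evaluate each conditional against the play variables and return a
    skipped result, recording the failing condition, without running."""
    for cond in task.get("when", []):
        if not eval(cond, {}, variables):  # stand-in for Jinja2 templating
            return {
                "changed": False,
                "skipped": True,
                "false_condition": cond,
                "skip_reason": "Conditional result was False",
            }
    return {"changed": True}

play_vars = {"network_state": {}, "ansible_distribution_major_version": "9"}
task = {"when": ["ansible_distribution_major_version != '6'", "network_state != {}"]}
print(run_task(task, play_vars))
```

With these variables the first conditional passes and the second fails, matching the log: `Evaluated conditional (ansible_distribution_major_version != '6'): True` followed by `Evaluated conditional (network_state != {}): False` and the skip.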
15494 1726853365.95133: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15494 1726853365.95241: in run() - task 02083763-bbaf-0028-1a50-00000000006c 15494 1726853365.95278: variable 'ansible_search_path' from source: unknown 15494 1726853365.95282: variable 'ansible_search_path' from source: unknown 15494 1726853365.95316: calling self._execute() 15494 1726853365.95424: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.95428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.95576: variable 'omit' from source: magic vars 15494 1726853365.95845: variable 'ansible_distribution_major_version' from source: facts 15494 1726853365.95856: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853365.95862: variable 'omit' from source: magic vars 15494 1726853365.95911: variable 'omit' from source: magic vars 15494 1726853365.95954: variable 'omit' from source: magic vars 15494 1726853365.95996: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853365.96043: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853365.96064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853365.96082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853365.96095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853365.96135: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853365.96140: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.96142: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15494 1726853365.96256: Set connection var ansible_connection to ssh 15494 1726853365.96262: Set connection var ansible_pipelining to False 15494 1726853365.96268: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853365.96273: Set connection var ansible_shell_type to sh 15494 1726853365.96276: Set connection var ansible_timeout to 10 15494 1726853365.96285: Set connection var ansible_shell_executable to /bin/sh 15494 1726853365.96308: variable 'ansible_shell_executable' from source: unknown 15494 1726853365.96312: variable 'ansible_connection' from source: unknown 15494 1726853365.96315: variable 'ansible_module_compression' from source: unknown 15494 1726853365.96318: variable 'ansible_shell_type' from source: unknown 15494 1726853365.96320: variable 'ansible_shell_executable' from source: unknown 15494 1726853365.96323: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853365.96325: variable 'ansible_pipelining' from source: unknown 15494 1726853365.96378: variable 'ansible_timeout' from source: unknown 15494 1726853365.96382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853365.96488: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853365.96556: variable 'omit' from source: magic vars 15494 1726853365.96560: starting attempt loop 15494 1726853365.96562: running the handler 15494 1726853365.96636: variable '__network_connections_result' from source: set_fact 15494 1726853365.96696: handler run complete 15494 1726853365.96712: attempt loop complete, returning result 15494 1726853365.96715: _execute() done 15494 1726853365.96718: dumping result to json 15494 1726853365.96720: 
done dumping result, returning 15494 1726853365.96729: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-0028-1a50-00000000006c] 15494 1726853365.96733: sending task result for task 02083763-bbaf-0028-1a50-00000000006c 15494 1726853365.96924: done sending task result for task 02083763-bbaf-0028-1a50-00000000006c 15494 1726853365.96928: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
15494 1726853365.96984: no more pending results, returning what we have 15494 1726853365.96988: results queue empty 15494 1726853365.96988: checking for any_errors_fatal 15494 1726853365.96994: done checking for any_errors_fatal 15494 1726853365.96995: checking for max_fail_percentage 15494 1726853365.96996: done checking for max_fail_percentage 15494 1726853365.96997: checking to see if all hosts have failed and the running result is not ok 15494 1726853365.96998: done checking to see if all hosts have failed 15494 1726853365.96999: getting the remaining hosts for this loop 15494 1726853365.97000: done getting the remaining hosts for this loop 15494 1726853365.97004: getting the next task for host managed_node1 15494 1726853365.97009: done getting next task for host managed_node1 15494 1726853365.97012: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15494 1726853365.97014: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 15494 1726853365.97023: getting variables 15494 1726853365.97026: in VariableManager get_vars() 15494 1726853365.97062: Calling all_inventory to load vars for managed_node1 15494 1726853365.97065: Calling groups_inventory to load vars for managed_node1 15494 1726853365.97068: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853365.97079: Calling all_plugins_play to load vars for managed_node1 15494 1726853365.97082: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853365.97084: Calling groups_plugins_play to load vars for managed_node1 15494 1726853365.98779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.00398: done with get_vars() 15494 1726853366.00421: done getting variables 15494 1726853366.00492: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:29:26 -0400 (0:00:00.060) 0:00:34.621 ****** 15494 1726853366.00522: entering _queue_task() for managed_node1/debug 15494 1726853366.00874: worker is 1 (out of 1 available) 15494 1726853366.00885: exiting _queue_task() for managed_node1/debug 15494 1726853366.01083: done queuing things up, now waiting for results queue to drain 15494 1726853366.01085: waiting for pending results... 
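As an editorial aside on the result just printed: the debug task reports `"__network_connections_result.stderr_lines": [""]`, and later in this log the raw `stderr` for the same result is shown as `"\n"`. That mapping is ordinary Python `str.splitlines()` behaviour, which Ansible applies when deriving the `*_lines` companions of `stdout`/`stderr`. A minimal sketch (the helper name `to_lines` is illustrative, not Ansible's internal name):

```python
# Sketch of how "*_lines" values relate to raw captured output.
# This is plain str.splitlines() behaviour; "to_lines" is an
# illustrative name, not an actual Ansible helper.

def to_lines(raw: str) -> list:
    """Split captured stdout/stderr into a list of lines."""
    return raw.splitlines()

# A lone trailing newline yields a single empty-string entry, which is
# why stderr of "\n" appears in the task result as stderr_lines [""].
print(to_lines("\n"))      # expected: ['']
print(to_lines(""))        # expected: []
print(to_lines("a\nb\n"))  # expected: ['a', 'b']
```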
15494 1726853366.01288: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15494 1726853366.01294: in run() - task 02083763-bbaf-0028-1a50-00000000006d 15494 1726853366.01296: variable 'ansible_search_path' from source: unknown 15494 1726853366.01300: variable 'ansible_search_path' from source: unknown 15494 1726853366.01349: calling self._execute() 15494 1726853366.01458: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.01463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.01576: variable 'omit' from source: magic vars 15494 1726853366.01869: variable 'ansible_distribution_major_version' from source: facts 15494 1726853366.01881: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853366.01887: variable 'omit' from source: magic vars 15494 1726853366.01925: variable 'omit' from source: magic vars 15494 1726853366.01959: variable 'omit' from source: magic vars 15494 1726853366.02008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853366.02042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853366.02061: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853366.02089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853366.02102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853366.02131: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853366.02135: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.02138: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 15494 1726853366.02246: Set connection var ansible_connection to ssh 15494 1726853366.02251: Set connection var ansible_pipelining to False 15494 1726853366.02256: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853366.02259: Set connection var ansible_shell_type to sh 15494 1726853366.02380: Set connection var ansible_timeout to 10 15494 1726853366.02383: Set connection var ansible_shell_executable to /bin/sh 15494 1726853366.02386: variable 'ansible_shell_executable' from source: unknown 15494 1726853366.02388: variable 'ansible_connection' from source: unknown 15494 1726853366.02390: variable 'ansible_module_compression' from source: unknown 15494 1726853366.02392: variable 'ansible_shell_type' from source: unknown 15494 1726853366.02394: variable 'ansible_shell_executable' from source: unknown 15494 1726853366.02396: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.02398: variable 'ansible_pipelining' from source: unknown 15494 1726853366.02400: variable 'ansible_timeout' from source: unknown 15494 1726853366.02402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.02465: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853366.02477: variable 'omit' from source: magic vars 15494 1726853366.02483: starting attempt loop 15494 1726853366.02486: running the handler 15494 1726853366.02544: variable '__network_connections_result' from source: set_fact 15494 1726853366.02614: variable '__network_connections_result' from source: set_fact 15494 1726853366.02720: handler run complete 15494 1726853366.02753: attempt loop complete, returning result 15494 1726853366.02756: 
_execute() done 15494 1726853366.02758: dumping result to json 15494 1726853366.02761: done dumping result, returning 15494 1726853366.02770: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-0028-1a50-00000000006d] 15494 1726853366.02774: sending task result for task 02083763-bbaf-0028-1a50-00000000006d 15494 1726853366.02863: done sending task result for task 02083763-bbaf-0028-1a50-00000000006d 15494 1726853366.02867: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "LSR-TST-br31",
                        "persistent_state": "absent"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
15494 1726853366.03075: no more pending results, returning what we have 15494 1726853366.03080: results queue empty 15494 1726853366.03081: checking for any_errors_fatal 15494 1726853366.03086: done checking for any_errors_fatal 15494 1726853366.03087: checking for max_fail_percentage 15494 1726853366.03089: done checking for max_fail_percentage 15494 1726853366.03089: checking to see if all hosts have failed and the running result is not ok 15494 1726853366.03090: done checking to see if all hosts have failed 15494 1726853366.03091: getting the remaining hosts for this loop 15494 1726853366.03093: done getting the remaining hosts for this loop 15494 1726853366.03096: getting the next task for host managed_node1 15494 1726853366.03102: done getting next task for host managed_node1 15494 1726853366.03106: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15494 1726853366.03108: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0,
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853366.03117: getting variables 15494 1726853366.03118: in VariableManager get_vars() 15494 1726853366.03152: Calling all_inventory to load vars for managed_node1 15494 1726853366.03154: Calling groups_inventory to load vars for managed_node1 15494 1726853366.03157: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.03165: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.03169: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.03178: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.04610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.06321: done with get_vars() 15494 1726853366.06349: done getting variables 15494 1726853366.06412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:29:26 -0400 (0:00:00.059) 0:00:34.680 ****** 15494 1726853366.06453: entering _queue_task() for managed_node1/debug 15494 1726853366.06831: worker is 1 (out of 1 available) 15494 1726853366.06843: exiting _queue_task() for managed_node1/debug 15494 1726853366.06854: done queuing things up, now waiting for results queue to drain 15494 1726853366.06856: waiting for pending results... 
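Each task in this trace logs `Evaluated conditional (ansible_distribution_major_version != '6'): True` before running its handler, and the network_state task is skipped on `(network_state != {}): False`. Note that distribution facts are strings, so the first guard is a string comparison, not a numeric one. A minimal sketch of both evaluations, with plain Python standing in for Ansible's Jinja2 engine (the sample fact value `"40"` is illustrative, not taken from this log):

```python
# Plain-Python stand-in for the two "when:" conditionals evaluated in
# this trace. Facts like ansible_distribution_major_version are
# strings, so the "!= '6'" guard compares strings.

facts = {"ansible_distribution_major_version": "40"}  # illustrative value

run_task = facts["ansible_distribution_major_version"] != "6"
print(run_task)  # expected: True -> task runs

# The network_state guard: the role default is an empty dict, so the
# conditional is False and the task is skipped.
network_state = {}
print(network_state != {})  # expected: False -> task skipped
```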
15494 1726853366.07193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15494 1726853366.07278: in run() - task 02083763-bbaf-0028-1a50-00000000006e 15494 1726853366.07284: variable 'ansible_search_path' from source: unknown 15494 1726853366.07288: variable 'ansible_search_path' from source: unknown 15494 1726853366.07290: calling self._execute() 15494 1726853366.07406: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.07410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.07413: variable 'omit' from source: magic vars 15494 1726853366.07774: variable 'ansible_distribution_major_version' from source: facts 15494 1726853366.07788: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853366.07951: variable 'network_state' from source: role '' defaults 15494 1726853366.07955: Evaluated conditional (network_state != {}): False 15494 1726853366.07958: when evaluation is False, skipping this task 15494 1726853366.07960: _execute() done 15494 1726853366.07962: dumping result to json 15494 1726853366.07964: done dumping result, returning 15494 1726853366.07966: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-0028-1a50-00000000006e] 15494 1726853366.07969: sending task result for task 02083763-bbaf-0028-1a50-00000000006e 15494 1726853366.08042: done sending task result for task 02083763-bbaf-0028-1a50-00000000006e 15494 1726853366.08046: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
15494 1726853366.08205: no more pending results, returning what we have 15494 1726853366.08209: results queue empty 15494 1726853366.08210: checking for any_errors_fatal 15494 1726853366.08218: done checking for any_errors_fatal 15494 1726853366.08219: checking for
max_fail_percentage 15494 1726853366.08221: done checking for max_fail_percentage 15494 1726853366.08222: checking to see if all hosts have failed and the running result is not ok 15494 1726853366.08223: done checking to see if all hosts have failed 15494 1726853366.08225: getting the remaining hosts for this loop 15494 1726853366.08227: done getting the remaining hosts for this loop 15494 1726853366.08231: getting the next task for host managed_node1 15494 1726853366.08237: done getting next task for host managed_node1 15494 1726853366.08241: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15494 1726853366.08245: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853366.08258: getting variables 15494 1726853366.08260: in VariableManager get_vars() 15494 1726853366.08300: Calling all_inventory to load vars for managed_node1 15494 1726853366.08302: Calling groups_inventory to load vars for managed_node1 15494 1726853366.08305: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.08315: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.08318: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.08321: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.10001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.11603: done with get_vars() 15494 1726853366.11624: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:29:26 -0400 
(0:00:00.052) 0:00:34.732 ****** 15494 1726853366.11719: entering _queue_task() for managed_node1/ping 15494 1726853366.12060: worker is 1 (out of 1 available) 15494 1726853366.12189: exiting _queue_task() for managed_node1/ping 15494 1726853366.12200: done queuing things up, now waiting for results queue to drain 15494 1726853366.12201: waiting for pending results... 15494 1726853366.12487: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 15494 1726853366.12492: in run() - task 02083763-bbaf-0028-1a50-00000000006f 15494 1726853366.12504: variable 'ansible_search_path' from source: unknown 15494 1726853366.12507: variable 'ansible_search_path' from source: unknown 15494 1726853366.12556: calling self._execute() 15494 1726853366.12675: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.12679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.12682: variable 'omit' from source: magic vars 15494 1726853366.13276: variable 'ansible_distribution_major_version' from source: facts 15494 1726853366.13281: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853366.13284: variable 'omit' from source: magic vars 15494 1726853366.13287: variable 'omit' from source: magic vars 15494 1726853366.13290: variable 'omit' from source: magic vars 15494 1726853366.13293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853366.13298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853366.13306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853366.13325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853366.13336: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853366.13364: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853366.13368: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.13370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.13483: Set connection var ansible_connection to ssh 15494 1726853366.13489: Set connection var ansible_pipelining to False 15494 1726853366.13499: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853366.13502: Set connection var ansible_shell_type to sh 15494 1726853366.13512: Set connection var ansible_timeout to 10 15494 1726853366.13523: Set connection var ansible_shell_executable to /bin/sh 15494 1726853366.13546: variable 'ansible_shell_executable' from source: unknown 15494 1726853366.13552: variable 'ansible_connection' from source: unknown 15494 1726853366.13555: variable 'ansible_module_compression' from source: unknown 15494 1726853366.13557: variable 'ansible_shell_type' from source: unknown 15494 1726853366.13559: variable 'ansible_shell_executable' from source: unknown 15494 1726853366.13561: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.13564: variable 'ansible_pipelining' from source: unknown 15494 1726853366.13566: variable 'ansible_timeout' from source: unknown 15494 1726853366.13568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.13777: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853366.13789: variable 'omit' from source: magic vars 15494 1726853366.13792: starting attempt loop 15494 1726853366.13795: running 
the handler 15494 1726853366.13807: _low_level_execute_command(): starting 15494 1726853366.13820: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853366.14529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.14617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.14656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.16358: stdout chunk (state=3): >>>/root <<< 15494 1726853366.16517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.16521: stdout chunk (state=3): >>><<< 15494 1726853366.16523: stderr chunk (state=3): >>><<< 15494 1726853366.16557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853366.16661: _low_level_execute_command(): starting 15494 1726853366.16665: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663 `" && echo ansible-tmp-1726853366.1656446-17048-227917697663663="` echo /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663 `" ) && sleep 0' 15494 1726853366.17256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853366.17269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.17287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853366.17303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853366.17329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 <<< 15494 1726853366.17339: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853366.17445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853366.17469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.17490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.17566: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.19433: stdout chunk (state=3): >>>ansible-tmp-1726853366.1656446-17048-227917697663663=/root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663 <<< 15494 1726853366.19586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.19596: stdout chunk (state=3): >>><<< 15494 1726853366.19606: stderr chunk (state=3): >>><<< 15494 1726853366.19628: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853366.1656446-17048-227917697663663=/root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853366.19776: variable 'ansible_module_compression' from source: unknown 15494 1726853366.19779: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15494 1726853366.19783: variable 'ansible_facts' from source: unknown 15494 1726853366.19878: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py 15494 1726853366.20027: Sending initial data 15494 1726853366.20085: Sent initial data (153 bytes) 15494 1726853366.20710: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853366.20725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.20789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.20860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.20884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.20952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.22510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853366.22576: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853366.22637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpo_a7_hbh /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py <<< 15494 1726853366.22641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py" <<< 15494 1726853366.22676: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpo_a7_hbh" to remote "/root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py" <<< 15494 1726853366.23453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.23502: stderr chunk (state=3): >>><<< 15494 1726853366.23531: stdout chunk (state=3): >>><<< 15494 1726853366.23542: done transferring module to remote 15494 1726853366.23632: _low_level_execute_command(): starting 15494 1726853366.23642: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/ /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py && sleep 0' 15494 1726853366.24199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.24231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.24266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.26288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.26292: stdout chunk (state=3): >>><<< 15494 1726853366.26296: stderr chunk (state=3): >>><<< 15494 1726853366.26323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853366.26397: _low_level_execute_command(): starting 15494 1726853366.26400: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/AnsiballZ_ping.py && sleep 0' 15494 1726853366.27046: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853366.27064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.27168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.27201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853366.27222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.27238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.27321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.42378: stdout chunk (state=3): >>> 
{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15494 1726853366.43891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853366.43895: stdout chunk (state=3): >>><<< 15494 1726853366.43897: stderr chunk (state=3): >>><<< 15494 1726853366.43900: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853366.43903: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853366.43906: _low_level_execute_command(): starting 15494 1726853366.43909: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853366.1656446-17048-227917697663663/ > /dev/null 2>&1 && sleep 0' 15494 1726853366.44494: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853366.44510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.44526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853366.44632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 
10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.44653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853366.44673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.44852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.44916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.46799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.46809: stdout chunk (state=3): >>><<< 15494 1726853366.46827: stderr chunk (state=3): >>><<< 15494 1726853366.46857: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 15494 1726853366.46870: handler run complete 15494 1726853366.46895: attempt loop complete, returning result 15494 1726853366.46903: _execute() done 15494 1726853366.46913: dumping result to json 15494 1726853366.46977: done dumping result, returning 15494 1726853366.46980: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-0028-1a50-00000000006f] 15494 1726853366.46982: sending task result for task 02083763-bbaf-0028-1a50-00000000006f 15494 1726853366.47068: done sending task result for task 02083763-bbaf-0028-1a50-00000000006f 15494 1726853366.47078: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 15494 1726853366.47150: no more pending results, returning what we have 15494 1726853366.47155: results queue empty 15494 1726853366.47156: checking for any_errors_fatal 15494 1726853366.47166: done checking for any_errors_fatal 15494 1726853366.47167: checking for max_fail_percentage 15494 1726853366.47169: done checking for max_fail_percentage 15494 1726853366.47170: checking to see if all hosts have failed and the running result is not ok 15494 1726853366.47295: done checking to see if all hosts have failed 15494 1726853366.47296: getting the remaining hosts for this loop 15494 1726853366.47298: done getting the remaining hosts for this loop 15494 1726853366.47302: getting the next task for host managed_node1 15494 1726853366.47311: done getting next task for host managed_node1 15494 1726853366.47315: ^ task is: TASK: meta (role_complete) 15494 1726853366.47317: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853366.47327: getting variables 15494 1726853366.47330: in VariableManager get_vars() 15494 1726853366.47508: Calling all_inventory to load vars for managed_node1 15494 1726853366.47511: Calling groups_inventory to load vars for managed_node1 15494 1726853366.47514: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.47524: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.47527: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.47531: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.49144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.51900: done with get_vars() 15494 1726853366.51933: done getting variables 15494 1726853366.52023: done queuing things up, now waiting for results queue to drain 15494 1726853366.52025: results queue empty 15494 1726853366.52026: checking for any_errors_fatal 15494 1726853366.52029: done checking for any_errors_fatal 15494 1726853366.52030: checking for max_fail_percentage 15494 1726853366.52035: done checking for max_fail_percentage 15494 1726853366.52036: checking to see if all hosts have failed and the running result is not ok 15494 1726853366.52037: done checking to see if all hosts have failed 15494 1726853366.52038: getting the remaining hosts for this loop 15494 1726853366.52039: done getting the remaining hosts for this loop 15494 1726853366.52042: getting the next task for host managed_node1 15494 1726853366.52046: done getting next task for host managed_node1 15494 1726853366.52050: ^ task is: TASK: meta (flush_handlers) 15494 1726853366.52052: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15494 1726853366.52055: getting variables 15494 1726853366.52056: in VariableManager get_vars() 15494 1726853366.52069: Calling all_inventory to load vars for managed_node1 15494 1726853366.52076: Calling groups_inventory to load vars for managed_node1 15494 1726853366.52078: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.52083: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.52086: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.52090: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.53419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.56598: done with get_vars() 15494 1726853366.56637: done getting variables 15494 1726853366.56696: in VariableManager get_vars() 15494 1726853366.56710: Calling all_inventory to load vars for managed_node1 15494 1726853366.56712: Calling groups_inventory to load vars for managed_node1 15494 1726853366.56714: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.56720: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.56722: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.56734: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.58004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.60589: done with get_vars() 15494 1726853366.60623: done queuing things up, now waiting for results queue to drain 15494 1726853366.60626: results queue empty 15494 1726853366.60627: checking for any_errors_fatal 15494 1726853366.60628: done checking for any_errors_fatal 15494 1726853366.60629: checking for max_fail_percentage 15494 1726853366.60630: done checking for max_fail_percentage 15494 1726853366.60631: checking to see if all hosts have failed and 
the running result is not ok 15494 1726853366.60631: done checking to see if all hosts have failed 15494 1726853366.60632: getting the remaining hosts for this loop 15494 1726853366.60633: done getting the remaining hosts for this loop 15494 1726853366.60636: getting the next task for host managed_node1 15494 1726853366.60640: done getting next task for host managed_node1 15494 1726853366.60641: ^ task is: TASK: meta (flush_handlers) 15494 1726853366.60643: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853366.60646: getting variables 15494 1726853366.60647: in VariableManager get_vars() 15494 1726853366.60661: Calling all_inventory to load vars for managed_node1 15494 1726853366.60663: Calling groups_inventory to load vars for managed_node1 15494 1726853366.60665: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.60673: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.60676: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.60679: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.61921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.64375: done with get_vars() 15494 1726853366.64398: done getting variables 15494 1726853366.64463: in VariableManager get_vars() 15494 1726853366.64478: Calling all_inventory to load vars for managed_node1 15494 1726853366.64481: Calling groups_inventory to load vars for managed_node1 15494 1726853366.64483: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.64487: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.64490: Calling 
groups_plugins_inventory to load vars for managed_node1 15494 1726853366.64493: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.65865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.67702: done with get_vars() 15494 1726853366.67728: done queuing things up, now waiting for results queue to drain 15494 1726853366.67730: results queue empty 15494 1726853366.67731: checking for any_errors_fatal 15494 1726853366.67732: done checking for any_errors_fatal 15494 1726853366.67733: checking for max_fail_percentage 15494 1726853366.67733: done checking for max_fail_percentage 15494 1726853366.67734: checking to see if all hosts have failed and the running result is not ok 15494 1726853366.67735: done checking to see if all hosts have failed 15494 1726853366.67736: getting the remaining hosts for this loop 15494 1726853366.67741: done getting the remaining hosts for this loop 15494 1726853366.67744: getting the next task for host managed_node1 15494 1726853366.67751: done getting next task for host managed_node1 15494 1726853366.67752: ^ task is: None 15494 1726853366.67753: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853366.67754: done queuing things up, now waiting for results queue to drain 15494 1726853366.67755: results queue empty 15494 1726853366.67756: checking for any_errors_fatal 15494 1726853366.67756: done checking for any_errors_fatal 15494 1726853366.67757: checking for max_fail_percentage 15494 1726853366.67758: done checking for max_fail_percentage 15494 1726853366.67759: checking to see if all hosts have failed and the running result is not ok 15494 1726853366.67759: done checking to see if all hosts have failed 15494 1726853366.67760: getting the next task for host managed_node1 15494 1726853366.67762: done getting next task for host managed_node1 15494 1726853366.67763: ^ task is: None 15494 1726853366.67764: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853366.67812: in VariableManager get_vars() 15494 1726853366.67829: done with get_vars() 15494 1726853366.67835: in VariableManager get_vars() 15494 1726853366.67844: done with get_vars() 15494 1726853366.67850: variable 'omit' from source: magic vars 15494 1726853366.67982: variable 'task' from source: play vars 15494 1726853366.68021: in VariableManager get_vars() 15494 1726853366.68033: done with get_vars() 15494 1726853366.68055: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15494 1726853366.68357: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853366.68432: getting the remaining hosts for this loop 15494 1726853366.68434: done getting the remaining hosts for this loop 15494 1726853366.68436: getting the next task for host managed_node1 15494 1726853366.68439: done getting next task for host managed_node1 15494 1726853366.68441: ^ task is: TASK: Gathering Facts 15494 1726853366.68443: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853366.68445: getting variables 15494 1726853366.68445: in VariableManager get_vars() 15494 1726853366.68456: Calling all_inventory to load vars for managed_node1 15494 1726853366.68459: Calling groups_inventory to load vars for managed_node1 15494 1726853366.68461: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853366.68467: Calling all_plugins_play to load vars for managed_node1 15494 1726853366.68469: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853366.68475: Calling groups_plugins_play to load vars for managed_node1 15494 1726853366.69798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853366.71539: done with get_vars() 15494 1726853366.71563: done getting variables 15494 1726853366.71616: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:29:26 -0400 (0:00:00.599) 0:00:35.332 ****** 15494 1726853366.71641: entering _queue_task() for managed_node1/gather_facts 15494 1726853366.72085: worker is 1 (out of 1 available) 15494 1726853366.72096: exiting _queue_task() for managed_node1/gather_facts 15494 1726853366.72109: done queuing things up, now waiting for results queue to drain 15494 1726853366.72110: waiting for pending results... 
15494 1726853366.72460: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853366.72525: in run() - task 02083763-bbaf-0028-1a50-00000000046e 15494 1726853366.72547: variable 'ansible_search_path' from source: unknown 15494 1726853366.72612: calling self._execute() 15494 1726853366.72710: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.72714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.72739: variable 'omit' from source: magic vars 15494 1726853366.73207: variable 'ansible_distribution_major_version' from source: facts 15494 1726853366.73210: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853366.73213: variable 'omit' from source: magic vars 15494 1726853366.73215: variable 'omit' from source: magic vars 15494 1726853366.73218: variable 'omit' from source: magic vars 15494 1726853366.73248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853366.73291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853366.73334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853366.73349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853366.73355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853366.73381: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853366.73384: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.73391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.73485: Set connection var ansible_connection to ssh 15494 1726853366.73489: Set 
connection var ansible_pipelining to False 15494 1726853366.73495: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853366.73498: Set connection var ansible_shell_type to sh 15494 1726853366.73503: Set connection var ansible_timeout to 10 15494 1726853366.73509: Set connection var ansible_shell_executable to /bin/sh 15494 1726853366.73546: variable 'ansible_shell_executable' from source: unknown 15494 1726853366.73549: variable 'ansible_connection' from source: unknown 15494 1726853366.73552: variable 'ansible_module_compression' from source: unknown 15494 1726853366.73554: variable 'ansible_shell_type' from source: unknown 15494 1726853366.73556: variable 'ansible_shell_executable' from source: unknown 15494 1726853366.73558: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853366.73564: variable 'ansible_pipelining' from source: unknown 15494 1726853366.73675: variable 'ansible_timeout' from source: unknown 15494 1726853366.73678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853366.73750: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853366.73767: variable 'omit' from source: magic vars 15494 1726853366.73780: starting attempt loop 15494 1726853366.73787: running the handler 15494 1726853366.73810: variable 'ansible_facts' from source: unknown 15494 1726853366.73834: _low_level_execute_command(): starting 15494 1726853366.73847: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853366.74697: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853366.74729: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 15494 1726853366.74746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853366.74766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853366.74873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.74921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.74964: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.76647: stdout chunk (state=3): >>>/root <<< 15494 1726853366.76776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.76784: stdout chunk (state=3): >>><<< 15494 1726853366.76793: stderr chunk (state=3): >>><<< 15494 1726853366.76811: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853366.76824: _low_level_execute_command(): starting 15494 1726853366.76831: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866 `" && echo ansible-tmp-1726853366.7681608-17078-193526800592866="` echo /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866 `" ) && sleep 0' 15494 1726853366.77239: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.77265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853366.77291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.77341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.77374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.77530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.79401: stdout chunk (state=3): >>>ansible-tmp-1726853366.7681608-17078-193526800592866=/root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866 <<< 15494 1726853366.79510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.79538: stderr chunk (state=3): >>><<< 15494 1726853366.79544: stdout chunk (state=3): >>><<< 15494 1726853366.79562: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853366.7681608-17078-193526800592866=/root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853366.79591: variable 'ansible_module_compression' from source: unknown 15494 1726853366.79637: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853366.79689: variable 'ansible_facts' from source: unknown 15494 1726853366.79819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py 15494 1726853366.79923: Sending initial data 15494 1726853366.79927: Sent initial data (154 bytes) 15494 1726853366.80368: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.80373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853366.80375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.80378: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.80433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853366.80440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.80499: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.82060: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853366.82103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853366.82144: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpi8prq0gj /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py <<< 15494 1726853366.82158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py" <<< 15494 1726853366.82195: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15494 1726853366.82220: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpi8prq0gj" to remote "/root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py" <<< 15494 1726853366.83338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.83376: stderr chunk (state=3): >>><<< 15494 1726853366.83379: stdout chunk (state=3): >>><<< 15494 1726853366.83396: done transferring module to remote 15494 1726853366.83404: _low_level_execute_command(): starting 15494 1726853366.83409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/ /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py && sleep 0' 15494 1726853366.83832: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853366.83835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853366.83838: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853366.83840: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853366.83842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.83894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853366.83897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.83941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853366.85676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853366.85700: stderr chunk (state=3): >>><<< 15494 1726853366.85703: stdout chunk (state=3): >>><<< 15494 1726853366.85717: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853366.85720: _low_level_execute_command(): starting 15494 1726853366.85725: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/AnsiballZ_setup.py && sleep 0' 15494 1726853366.86130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853366.86133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853366.86136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853366.86139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853366.86141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853366.86189: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853366.86193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853366.86238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853367.48880: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": 
"64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.5166015625, "5m": 0.35498046875, "15m": 0.1591796875}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "27", "epoch": "1726853367", "epoch_int": "1726853367", "date": "2024-09-20", "time": "13:29:27", "iso8601_micro": "2024-09-20T17:29:27.138242Z", "iso8601": "2024-09-20T17:29:27Z", "iso8601_basic": "20240920T132927138242", 
"iso8601_basic_short": "20240920T132927", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptota<<< 15494 1726853367.48885: stdout chunk (state=3): >>>l_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 533, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795016704, "block_size": 4096, "block_total": 65519099, "block_available": 63914799, "block_used": 1604300, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_ha<<< 15494 1726853367.48913: stdout chunk (state=3): >>>shing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": 
"off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": 
"||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853367.50842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853367.50846: stdout chunk (state=3): >>><<< 15494 1726853367.50852: stderr chunk (state=3): >>><<< 15494 1726853367.51085: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 
0.5166015625, "5m": 0.35498046875, "15m": 0.1591796875}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "27", "epoch": "1726853367", "epoch_int": "1726853367", "date": "2024-09-20", "time": "13:29:27", "iso8601_micro": "2024-09-20T17:29:27.138242Z", "iso8601": "2024-09-20T17:29:27Z", "iso8601_basic": "20240920T132927138242", "iso8601_basic_short": "20240920T132927", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 533, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261795016704, "block_size": 4096, "block_total": 65519099, "block_available": 63914799, "block_used": 1604300, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": 
"on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_fibre_channel_wwn": [], 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853367.51286: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853367.51322: _low_level_execute_command(): starting 15494 1726853367.51332: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853366.7681608-17078-193526800592866/ > /dev/null 2>&1 && sleep 0' 15494 1726853367.52005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853367.52020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853367.52034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853367.52152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853367.52177: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853367.52202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853367.52290: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853367.54087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853367.54159: stderr chunk (state=3): >>><<< 15494 1726853367.54178: stdout chunk (state=3): >>><<< 15494 1726853367.54198: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853367.54211: handler run complete 15494 1726853367.54334: variable 'ansible_facts' from source: unknown 15494 1726853367.54596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.54895: variable 'ansible_facts' from source: unknown 15494 1726853367.54980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.55137: attempt loop complete, returning result 15494 1726853367.55146: _execute() done 15494 1726853367.55156: dumping result to json 15494 1726853367.55194: done dumping result, returning 15494 1726853367.55215: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-00000000046e] 15494 1726853367.55225: sending task result for task 02083763-bbaf-0028-1a50-00000000046e 15494 1726853367.55981: done sending task result for task 02083763-bbaf-0028-1a50-00000000046e 15494 1726853367.55985: WORKER PROCESS EXITING ok: [managed_node1] 15494 1726853367.56527: no more pending results, returning what we have 15494 1726853367.56531: results queue empty 15494 1726853367.56532: checking for any_errors_fatal 15494 1726853367.56533: done checking for any_errors_fatal 15494 1726853367.56534: checking for max_fail_percentage 15494 1726853367.56536: done checking for max_fail_percentage 15494 1726853367.56537: checking to see if all hosts have failed and the running result is not ok 15494 1726853367.56537: done checking 
to see if all hosts have failed 15494 1726853367.56538: getting the remaining hosts for this loop 15494 1726853367.56545: done getting the remaining hosts for this loop 15494 1726853367.56551: getting the next task for host managed_node1 15494 1726853367.56556: done getting next task for host managed_node1 15494 1726853367.56558: ^ task is: TASK: meta (flush_handlers) 15494 1726853367.56560: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853367.56564: getting variables 15494 1726853367.56566: in VariableManager get_vars() 15494 1726853367.56591: Calling all_inventory to load vars for managed_node1 15494 1726853367.56594: Calling groups_inventory to load vars for managed_node1 15494 1726853367.56598: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.56610: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.56613: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.56616: Calling groups_plugins_play to load vars for managed_node1 15494 1726853367.58025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.59741: done with get_vars() 15494 1726853367.59767: done getting variables 15494 1726853367.59844: in VariableManager get_vars() 15494 1726853367.59856: Calling all_inventory to load vars for managed_node1 15494 1726853367.59859: Calling groups_inventory to load vars for managed_node1 15494 1726853367.59861: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.59865: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.59868: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.59872: 
Calling groups_plugins_play to load vars for managed_node1 15494 1726853367.61304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.62745: done with get_vars() 15494 1726853367.62769: done queuing things up, now waiting for results queue to drain 15494 1726853367.62773: results queue empty 15494 1726853367.62774: checking for any_errors_fatal 15494 1726853367.62777: done checking for any_errors_fatal 15494 1726853367.62777: checking for max_fail_percentage 15494 1726853367.62778: done checking for max_fail_percentage 15494 1726853367.62782: checking to see if all hosts have failed and the running result is not ok 15494 1726853367.62783: done checking to see if all hosts have failed 15494 1726853367.62784: getting the remaining hosts for this loop 15494 1726853367.62784: done getting the remaining hosts for this loop 15494 1726853367.62786: getting the next task for host managed_node1 15494 1726853367.62789: done getting next task for host managed_node1 15494 1726853367.62791: ^ task is: TASK: Include the task '{{ task }}' 15494 1726853367.62792: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853367.62794: getting variables 15494 1726853367.62794: in VariableManager get_vars() 15494 1726853367.62801: Calling all_inventory to load vars for managed_node1 15494 1726853367.62802: Calling groups_inventory to load vars for managed_node1 15494 1726853367.62804: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.62808: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.62809: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.62811: Calling groups_plugins_play to load vars for managed_node1 15494 1726853367.63465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.64728: done with get_vars() 15494 1726853367.64764: done getting variables 15494 1726853367.65043: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:29:27 -0400 (0:00:00.934) 0:00:36.266 ****** 15494 1726853367.65086: entering _queue_task() for managed_node1/include_tasks 15494 1726853367.65558: worker is 1 (out of 1 available) 15494 1726853367.65574: exiting _queue_task() for managed_node1/include_tasks 15494 1726853367.65596: done queuing things up, now waiting for results queue to drain 15494 1726853367.65601: waiting for pending results... 
15494 1726853367.65941: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' 15494 1726853367.66261: in run() - task 02083763-bbaf-0028-1a50-000000000073 15494 1726853367.66478: variable 'ansible_search_path' from source: unknown 15494 1726853367.66482: calling self._execute() 15494 1726853367.66484: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.66487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.66489: variable 'omit' from source: magic vars 15494 1726853367.66556: variable 'ansible_distribution_major_version' from source: facts 15494 1726853367.66574: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853367.66587: variable 'task' from source: play vars 15494 1726853367.66660: variable 'task' from source: play vars 15494 1726853367.66674: _execute() done 15494 1726853367.66682: dumping result to json 15494 1726853367.66690: done dumping result, returning 15494 1726853367.66776: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_profile_absent.yml' [02083763-bbaf-0028-1a50-000000000073] 15494 1726853367.66779: sending task result for task 02083763-bbaf-0028-1a50-000000000073 15494 1726853367.67076: done sending task result for task 02083763-bbaf-0028-1a50-000000000073 15494 1726853367.67080: WORKER PROCESS EXITING 15494 1726853367.67103: no more pending results, returning what we have 15494 1726853367.67108: in VariableManager get_vars() 15494 1726853367.67137: Calling all_inventory to load vars for managed_node1 15494 1726853367.67140: Calling groups_inventory to load vars for managed_node1 15494 1726853367.67143: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.67157: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.67160: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.67163: Calling 
groups_plugins_play to load vars for managed_node1 15494 1726853367.68656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.70017: done with get_vars() 15494 1726853367.70032: variable 'ansible_search_path' from source: unknown 15494 1726853367.70043: we have included files to process 15494 1726853367.70044: generating all_blocks data 15494 1726853367.70045: done generating all_blocks data 15494 1726853367.70045: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15494 1726853367.70046: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15494 1726853367.70048: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15494 1726853367.70164: in VariableManager get_vars() 15494 1726853367.70177: done with get_vars() 15494 1726853367.70254: done processing included file 15494 1726853367.70255: iterating over new_blocks loaded from include file 15494 1726853367.70256: in VariableManager get_vars() 15494 1726853367.70265: done with get_vars() 15494 1726853367.70266: filtering new block on tags 15494 1726853367.70278: done filtering new block on tags 15494 1726853367.70280: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node1 15494 1726853367.70283: extending task lists for all hosts with included blocks 15494 1726853367.70302: done extending task lists 15494 1726853367.70302: done processing included files 15494 1726853367.70303: results queue empty 15494 1726853367.70303: checking for any_errors_fatal 15494 1726853367.70304: done checking for any_errors_fatal 15494 
1726853367.70305: checking for max_fail_percentage 15494 1726853367.70305: done checking for max_fail_percentage 15494 1726853367.70306: checking to see if all hosts have failed and the running result is not ok 15494 1726853367.70306: done checking to see if all hosts have failed 15494 1726853367.70307: getting the remaining hosts for this loop 15494 1726853367.70307: done getting the remaining hosts for this loop 15494 1726853367.70309: getting the next task for host managed_node1 15494 1726853367.70311: done getting next task for host managed_node1 15494 1726853367.70313: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15494 1726853367.70314: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853367.70316: getting variables 15494 1726853367.70317: in VariableManager get_vars() 15494 1726853367.70322: Calling all_inventory to load vars for managed_node1 15494 1726853367.70323: Calling groups_inventory to load vars for managed_node1 15494 1726853367.70324: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.70328: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.70330: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.70331: Calling groups_plugins_play to load vars for managed_node1 15494 1726853367.71582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.74179: done with get_vars() 15494 1726853367.74208: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 13:29:27 -0400 (0:00:00.093) 0:00:36.360 ****** 15494 1726853367.74422: entering _queue_task() for managed_node1/include_tasks 15494 1726853367.74892: worker is 1 (out of 1 available) 15494 1726853367.74908: exiting _queue_task() for managed_node1/include_tasks 15494 1726853367.74920: done queuing things up, now waiting for results queue to drain 15494 1726853367.74921: waiting for pending results... 
15494 1726853367.75442: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 15494 1726853367.75604: in run() - task 02083763-bbaf-0028-1a50-00000000047f 15494 1726853367.75632: variable 'ansible_search_path' from source: unknown 15494 1726853367.75678: variable 'ansible_search_path' from source: unknown 15494 1726853367.75691: calling self._execute() 15494 1726853367.75805: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.75816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.75839: variable 'omit' from source: magic vars 15494 1726853367.76256: variable 'ansible_distribution_major_version' from source: facts 15494 1726853367.76379: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853367.76383: _execute() done 15494 1726853367.76386: dumping result to json 15494 1726853367.76389: done dumping result, returning 15494 1726853367.76391: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-0028-1a50-00000000047f] 15494 1726853367.76393: sending task result for task 02083763-bbaf-0028-1a50-00000000047f 15494 1726853367.76467: done sending task result for task 02083763-bbaf-0028-1a50-00000000047f 15494 1726853367.76470: WORKER PROCESS EXITING 15494 1726853367.76511: no more pending results, returning what we have 15494 1726853367.76517: in VariableManager get_vars() 15494 1726853367.76558: Calling all_inventory to load vars for managed_node1 15494 1726853367.76561: Calling groups_inventory to load vars for managed_node1 15494 1726853367.76565: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.76604: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.76608: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.76612: Calling groups_plugins_play to load vars for managed_node1 15494 
1726853367.78653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.79551: done with get_vars() 15494 1726853367.79565: variable 'ansible_search_path' from source: unknown 15494 1726853367.79565: variable 'ansible_search_path' from source: unknown 15494 1726853367.79574: variable 'task' from source: play vars 15494 1726853367.79651: variable 'task' from source: play vars 15494 1726853367.79676: we have included files to process 15494 1726853367.79677: generating all_blocks data 15494 1726853367.79678: done generating all_blocks data 15494 1726853367.79679: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15494 1726853367.79679: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15494 1726853367.79681: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15494 1726853367.80457: done processing included file 15494 1726853367.80459: iterating over new_blocks loaded from include file 15494 1726853367.80461: in VariableManager get_vars() 15494 1726853367.80476: done with get_vars() 15494 1726853367.80478: filtering new block on tags 15494 1726853367.80500: done filtering new block on tags 15494 1726853367.80503: in VariableManager get_vars() 15494 1726853367.80514: done with get_vars() 15494 1726853367.80515: filtering new block on tags 15494 1726853367.80534: done filtering new block on tags 15494 1726853367.80536: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 15494 1726853367.80540: extending task lists for all hosts with included blocks 15494 1726853367.80632: done extending 
task lists 15494 1726853367.80633: done processing included files 15494 1726853367.80634: results queue empty 15494 1726853367.80635: checking for any_errors_fatal 15494 1726853367.80638: done checking for any_errors_fatal 15494 1726853367.80639: checking for max_fail_percentage 15494 1726853367.80639: done checking for max_fail_percentage 15494 1726853367.80640: checking to see if all hosts have failed and the running result is not ok 15494 1726853367.80641: done checking to see if all hosts have failed 15494 1726853367.80642: getting the remaining hosts for this loop 15494 1726853367.80643: done getting the remaining hosts for this loop 15494 1726853367.80645: getting the next task for host managed_node1 15494 1726853367.80649: done getting next task for host managed_node1 15494 1726853367.80652: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15494 1726853367.80655: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853367.80657: getting variables 15494 1726853367.80658: in VariableManager get_vars() 15494 1726853367.80666: Calling all_inventory to load vars for managed_node1 15494 1726853367.80668: Calling groups_inventory to load vars for managed_node1 15494 1726853367.80670: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.80677: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.80679: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.80682: Calling groups_plugins_play to load vars for managed_node1 15494 1726853367.81833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.87443: done with get_vars() 15494 1726853367.87466: done getting variables 15494 1726853367.87510: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:29:27 -0400 (0:00:00.131) 0:00:36.491 ****** 15494 1726853367.87536: entering _queue_task() for managed_node1/set_fact 15494 1726853367.87889: worker is 1 (out of 1 available) 15494 1726853367.87900: exiting _queue_task() for managed_node1/set_fact 15494 1726853367.87913: done queuing things up, now waiting for results queue to drain 15494 1726853367.87915: waiting for pending results... 
15494 1726853367.88390: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 15494 1726853367.88395: in run() - task 02083763-bbaf-0028-1a50-00000000048a 15494 1726853367.88398: variable 'ansible_search_path' from source: unknown 15494 1726853367.88401: variable 'ansible_search_path' from source: unknown 15494 1726853367.88404: calling self._execute() 15494 1726853367.88497: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.88509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.88527: variable 'omit' from source: magic vars 15494 1726853367.88913: variable 'ansible_distribution_major_version' from source: facts 15494 1726853367.88929: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853367.88940: variable 'omit' from source: magic vars 15494 1726853367.88998: variable 'omit' from source: magic vars 15494 1726853367.89046: variable 'omit' from source: magic vars 15494 1726853367.89099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853367.89143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853367.89175: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853367.89203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853367.89221: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853367.89259: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853367.89270: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.89285: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 15494 1726853367.89475: Set connection var ansible_connection to ssh 15494 1726853367.89479: Set connection var ansible_pipelining to False 15494 1726853367.89481: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853367.89483: Set connection var ansible_shell_type to sh 15494 1726853367.89485: Set connection var ansible_timeout to 10 15494 1726853367.89488: Set connection var ansible_shell_executable to /bin/sh 15494 1726853367.89496: variable 'ansible_shell_executable' from source: unknown 15494 1726853367.89498: variable 'ansible_connection' from source: unknown 15494 1726853367.89501: variable 'ansible_module_compression' from source: unknown 15494 1726853367.89503: variable 'ansible_shell_type' from source: unknown 15494 1726853367.89505: variable 'ansible_shell_executable' from source: unknown 15494 1726853367.89507: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.89509: variable 'ansible_pipelining' from source: unknown 15494 1726853367.89511: variable 'ansible_timeout' from source: unknown 15494 1726853367.89519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.89673: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853367.89693: variable 'omit' from source: magic vars 15494 1726853367.89707: starting attempt loop 15494 1726853367.89720: running the handler 15494 1726853367.89736: handler run complete 15494 1726853367.89749: attempt loop complete, returning result 15494 1726853367.89759: _execute() done 15494 1726853367.89826: dumping result to json 15494 1726853367.89829: done dumping result, returning 15494 1726853367.89831: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-0028-1a50-00000000048a] 15494 1726853367.89833: sending task result for task 02083763-bbaf-0028-1a50-00000000048a 15494 1726853367.89902: done sending task result for task 02083763-bbaf-0028-1a50-00000000048a 15494 1726853367.89905: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15494 1726853367.89985: no more pending results, returning what we have 15494 1726853367.89991: results queue empty 15494 1726853367.89992: checking for any_errors_fatal 15494 1726853367.89994: done checking for any_errors_fatal 15494 1726853367.89994: checking for max_fail_percentage 15494 1726853367.89996: done checking for max_fail_percentage 15494 1726853367.89997: checking to see if all hosts have failed and the running result is not ok 15494 1726853367.89997: done checking to see if all hosts have failed 15494 1726853367.89998: getting the remaining hosts for this loop 15494 1726853367.90000: done getting the remaining hosts for this loop 15494 1726853367.90004: getting the next task for host managed_node1 15494 1726853367.90013: done getting next task for host managed_node1 15494 1726853367.90015: ^ task is: TASK: Stat profile file 15494 1726853367.90019: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853367.90028: getting variables 15494 1726853367.90029: in VariableManager get_vars() 15494 1726853367.90065: Calling all_inventory to load vars for managed_node1 15494 1726853367.90068: Calling groups_inventory to load vars for managed_node1 15494 1726853367.90074: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853367.90085: Calling all_plugins_play to load vars for managed_node1 15494 1726853367.90088: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853367.90093: Calling groups_plugins_play to load vars for managed_node1 15494 1726853367.91856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853367.93653: done with get_vars() 15494 1726853367.93681: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:29:27 -0400 (0:00:00.062) 0:00:36.553 ****** 15494 1726853367.93800: entering _queue_task() for managed_node1/stat 15494 1726853367.94390: worker is 1 (out of 1 available) 15494 1726853367.94399: exiting _queue_task() for managed_node1/stat 15494 1726853367.94409: done queuing things up, now waiting for results queue to drain 15494 1726853367.94410: waiting for pending results... 
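[Editor's note] The two task executions traced above and below are consistent with tasks from `get_profile_stat.yml` of roughly the following shape. This is a hedged reconstruction, not the verbatim playbook: the task names, fact names, and `stat` module arguments are taken directly from the log (the `set_fact` result at 1726853367.89833 and the `module_args` echoed at 1726853368.27604), while the path templating and the `register` name are assumptions inferred from the `profile`/`interface` variables and the later `profile_stat` references.

```yaml
# Hypothetical sketch of tasks in get_profile_stat.yml, reconstructed from the log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Stat profile file
  stat:
    # Assumed templating; the log shows the rendered path
    # /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # assumed name, matching the profile_stat variable used later
```

The log confirms the effect either way: the first task returns the three facts as `false` without touching the remote host, and the second dispatches `AnsiballZ_stat.py` over the multiplexed SSH connection and comes back with `"stat": {"exists": false}`.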
15494 1726853367.94541: running TaskExecutor() for managed_node1/TASK: Stat profile file 15494 1726853367.94623: in run() - task 02083763-bbaf-0028-1a50-00000000048b 15494 1726853367.94653: variable 'ansible_search_path' from source: unknown 15494 1726853367.94748: variable 'ansible_search_path' from source: unknown 15494 1726853367.94752: calling self._execute() 15494 1726853367.94821: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.94833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.94857: variable 'omit' from source: magic vars 15494 1726853367.95295: variable 'ansible_distribution_major_version' from source: facts 15494 1726853367.95312: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853367.95323: variable 'omit' from source: magic vars 15494 1726853367.95373: variable 'omit' from source: magic vars 15494 1726853367.95482: variable 'profile' from source: play vars 15494 1726853367.95492: variable 'interface' from source: set_fact 15494 1726853367.95567: variable 'interface' from source: set_fact 15494 1726853367.95591: variable 'omit' from source: magic vars 15494 1726853367.95650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853367.95694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853367.95835: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853367.95838: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853367.95841: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853367.95843: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 
1726853367.95845: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.95847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.95927: Set connection var ansible_connection to ssh 15494 1726853367.95952: Set connection var ansible_pipelining to False 15494 1726853367.95966: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853367.95976: Set connection var ansible_shell_type to sh 15494 1726853367.95990: Set connection var ansible_timeout to 10 15494 1726853367.96004: Set connection var ansible_shell_executable to /bin/sh 15494 1726853367.96037: variable 'ansible_shell_executable' from source: unknown 15494 1726853367.96056: variable 'ansible_connection' from source: unknown 15494 1726853367.96064: variable 'ansible_module_compression' from source: unknown 15494 1726853367.96074: variable 'ansible_shell_type' from source: unknown 15494 1726853367.96084: variable 'ansible_shell_executable' from source: unknown 15494 1726853367.96093: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853367.96104: variable 'ansible_pipelining' from source: unknown 15494 1726853367.96112: variable 'ansible_timeout' from source: unknown 15494 1726853367.96120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853367.96350: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853367.96368: variable 'omit' from source: magic vars 15494 1726853367.96488: starting attempt loop 15494 1726853367.96491: running the handler 15494 1726853367.96494: _low_level_execute_command(): starting 15494 1726853367.96496: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853367.97221: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853367.97242: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853367.97264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853367.97389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853367.97407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853367.97490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853367.99208: stdout chunk (state=3): >>>/root <<< 15494 1726853367.99331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853367.99334: stdout chunk (state=3): >>><<< 15494 1726853367.99336: stderr chunk (state=3): >>><<< 15494 1726853367.99453: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853367.99456: _low_level_execute_command(): starting 15494 1726853367.99459: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898 `" && echo ansible-tmp-1726853367.9936442-17125-77927461399898="` echo /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898 `" ) && sleep 0' 15494 1726853367.99990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853368.00006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.00029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.00052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.00073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853368.00136: stderr 
chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.00188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853368.00206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.00244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.00308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.02180: stdout chunk (state=3): >>>ansible-tmp-1726853367.9936442-17125-77927461399898=/root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898 <<< 15494 1726853368.02479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.02482: stdout chunk (state=3): >>><<< 15494 1726853368.02485: stderr chunk (state=3): >>><<< 15494 1726853368.02488: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853367.9936442-17125-77927461399898=/root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.02491: variable 'ansible_module_compression' from source: unknown 15494 1726853368.02493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15494 1726853368.02531: variable 'ansible_facts' from source: unknown 15494 1726853368.02644: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py 15494 1726853368.02825: Sending initial data 15494 1726853368.02885: Sent initial data (152 bytes) 15494 1726853368.03443: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853368.03454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.03570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.03578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.03633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.05157: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853368.05236: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853368.05362: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp9i0c2j3m /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py <<< 15494 1726853368.05365: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py" <<< 15494 1726853368.05465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmp9i0c2j3m" to remote "/root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py" <<< 15494 1726853368.07127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.07130: stdout chunk (state=3): >>><<< 15494 1726853368.07132: stderr chunk (state=3): >>><<< 15494 1726853368.07150: done transferring module to remote 15494 1726853368.07164: _low_level_execute_command(): starting 15494 1726853368.07170: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/ /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py && sleep 0' 15494 1726853368.08005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.08082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.08085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853368.08087: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853368.08092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.08094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.08177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.08210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.09998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.10019: stdout chunk (state=3): >>><<< 15494 1726853368.10032: stderr chunk (state=3): >>><<< 15494 1726853368.10056: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.10075: _low_level_execute_command(): starting 15494 1726853368.10078: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/AnsiballZ_stat.py && sleep 0' 15494 1726853368.11834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.11840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.11843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.11845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.12159: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' 
<<< 15494 1726853368.12279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.12335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.12439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.27604: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15494 1726853368.28997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853368.29104: stderr chunk (state=3): >>><<< 15494 1726853368.29109: stdout chunk (state=3): >>><<< 15494 1726853368.29111: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853368.29115: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853368.29121: _low_level_execute_command(): starting 15494 1726853368.29124: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853367.9936442-17125-77927461399898/ > /dev/null 2>&1 && sleep 0' 15494 1726853368.30049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853368.30054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.30215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.30220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.30222: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853368.30225: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853368.30227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.30230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.30278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853368.30320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.30382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.30427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.32295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.32486: stderr chunk (state=3): >>><<< 15494 1726853368.32489: stdout chunk (state=3): >>><<< 15494 1726853368.32491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.32493: handler run complete 15494 1726853368.32495: attempt loop complete, returning result 15494 1726853368.32496: _execute() done 15494 1726853368.32498: dumping result to json 15494 1726853368.32500: done dumping result, returning 15494 1726853368.32501: done running TaskExecutor() for managed_node1/TASK: Stat profile file [02083763-bbaf-0028-1a50-00000000048b] 15494 1726853368.32503: sending task result for task 02083763-bbaf-0028-1a50-00000000048b 15494 1726853368.32579: done sending task result for task 02083763-bbaf-0028-1a50-00000000048b 15494 1726853368.32582: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15494 1726853368.32639: no more pending results, returning what we have 15494 1726853368.32642: results queue empty 15494 1726853368.32643: checking for any_errors_fatal 15494 1726853368.32653: done checking for any_errors_fatal 15494 1726853368.32654: checking for max_fail_percentage 15494 1726853368.32655: done checking for max_fail_percentage 15494 1726853368.32656: checking to see if all hosts have failed and the running result is not ok 15494 1726853368.32657: done checking to see if all 
hosts have failed 15494 1726853368.32658: getting the remaining hosts for this loop 15494 1726853368.32659: done getting the remaining hosts for this loop 15494 1726853368.32663: getting the next task for host managed_node1 15494 1726853368.32683: done getting next task for host managed_node1 15494 1726853368.32687: ^ task is: TASK: Set NM profile exist flag based on the profile files 15494 1726853368.32690: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853368.32696: getting variables 15494 1726853368.32698: in VariableManager get_vars() 15494 1726853368.32728: Calling all_inventory to load vars for managed_node1 15494 1726853368.32730: Calling groups_inventory to load vars for managed_node1 15494 1726853368.32734: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853368.32746: Calling all_plugins_play to load vars for managed_node1 15494 1726853368.32751: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853368.32754: Calling groups_plugins_play to load vars for managed_node1 15494 1726853368.34285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853368.35821: done with get_vars() 15494 1726853368.35848: done getting variables 15494 1726853368.35920: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:29:28 -0400 (0:00:00.421) 0:00:36.975 ****** 15494 1726853368.35959: entering _queue_task() for managed_node1/set_fact 15494 1726853368.36329: worker is 1 (out of 1 available) 15494 1726853368.36340: exiting _queue_task() for managed_node1/set_fact 15494 1726853368.36356: done queuing things up, now waiting for results queue to drain 15494 1726853368.36358: waiting for pending results... 
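The `Stat profile file` result above (`"exists": false`) is what drives the conditional skip traced next. A minimal sketch of what that existence check amounts to on the managed node — the profile path used here is hypothetical, since the actual path checked by the stat task is not shown in this portion of the log:

```shell
# Hypothetical profile path -- the real path the stat task checked is
# not visible in this log chunk.
f=/etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection
if [ -e "$f" ]; then
    echo "exists=true"
else
    echo "exists=false"   # corresponds to the "exists": false seen above
fi
```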
15494 1726853368.36799: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 15494 1726853368.36817: in run() - task 02083763-bbaf-0028-1a50-00000000048c 15494 1726853368.36837: variable 'ansible_search_path' from source: unknown 15494 1726853368.36844: variable 'ansible_search_path' from source: unknown 15494 1726853368.36893: calling self._execute() 15494 1726853368.36991: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.37008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.37025: variable 'omit' from source: magic vars 15494 1726853368.37465: variable 'ansible_distribution_major_version' from source: facts 15494 1726853368.37484: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853368.37611: variable 'profile_stat' from source: set_fact 15494 1726853368.37650: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853368.37655: when evaluation is False, skipping this task 15494 1726853368.37662: _execute() done 15494 1726853368.37665: dumping result to json 15494 1726853368.37667: done dumping result, returning 15494 1726853368.37761: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-0028-1a50-00000000048c] 15494 1726853368.37765: sending task result for task 02083763-bbaf-0028-1a50-00000000048c 15494 1726853368.37847: done sending task result for task 02083763-bbaf-0028-1a50-00000000048c 15494 1726853368.37849: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853368.37911: no more pending results, returning what we have 15494 1726853368.37915: results queue empty 15494 1726853368.37917: checking for any_errors_fatal 15494 1726853368.37928: done checking for any_errors_fatal 15494 1726853368.37929: 
checking for max_fail_percentage 15494 1726853368.37931: done checking for max_fail_percentage 15494 1726853368.37931: checking to see if all hosts have failed and the running result is not ok 15494 1726853368.37932: done checking to see if all hosts have failed 15494 1726853368.37933: getting the remaining hosts for this loop 15494 1726853368.37935: done getting the remaining hosts for this loop 15494 1726853368.37938: getting the next task for host managed_node1 15494 1726853368.37948: done getting next task for host managed_node1 15494 1726853368.37950: ^ task is: TASK: Get NM profile info 15494 1726853368.37955: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853368.37960: getting variables 15494 1726853368.37961: in VariableManager get_vars() 15494 1726853368.38003: Calling all_inventory to load vars for managed_node1 15494 1726853368.38008: Calling groups_inventory to load vars for managed_node1 15494 1726853368.38013: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853368.38031: Calling all_plugins_play to load vars for managed_node1 15494 1726853368.38035: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853368.38043: Calling groups_plugins_play to load vars for managed_node1 15494 1726853368.39764: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853368.42353: done with get_vars() 15494 1726853368.42386: done getting variables 15494 1726853368.42464: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:29:28 -0400 (0:00:00.065) 0:00:37.040 ****** 15494 1726853368.42505: entering _queue_task() for managed_node1/shell 15494 1726853368.42863: worker is 1 (out of 1 available) 15494 1726853368.43079: exiting _queue_task() for managed_node1/shell 15494 1726853368.43091: done queuing things up, now waiting for results queue to drain 15494 1726853368.43093: waiting for pending results... 
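The `_low_level_execute_command()` calls traced below follow a fixed bootstrap pattern: first resolve the remote user's home with `echo ~`, then create a private temp directory for the module payload under `~/.ansible/tmp`. A simplified sketch of that sequence (the directory name is shortened; the log uses a timestamped name):

```shell
# Step 1: resolve the remote user's home directory (prints /root for root).
/bin/sh -c 'echo ~ && sleep 0'

# Step 2: create a mode-0700 temp dir for the module payload.
# umask 77 strips group/other bits, matching the "umask 77 && mkdir" in the log.
/bin/sh -c '( umask 77 && mkdir -p "$HOME/.ansible/tmp/demo" && echo "$HOME/.ansible/tmp/demo" )'
```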
15494 1726853368.43289: running TaskExecutor() for managed_node1/TASK: Get NM profile info 15494 1726853368.43347: in run() - task 02083763-bbaf-0028-1a50-00000000048d 15494 1726853368.43375: variable 'ansible_search_path' from source: unknown 15494 1726853368.43384: variable 'ansible_search_path' from source: unknown 15494 1726853368.43431: calling self._execute() 15494 1726853368.43551: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.43568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.43586: variable 'omit' from source: magic vars 15494 1726853368.44173: variable 'ansible_distribution_major_version' from source: facts 15494 1726853368.44179: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853368.44182: variable 'omit' from source: magic vars 15494 1726853368.44184: variable 'omit' from source: magic vars 15494 1726853368.44186: variable 'profile' from source: play vars 15494 1726853368.44189: variable 'interface' from source: set_fact 15494 1726853368.44227: variable 'interface' from source: set_fact 15494 1726853368.44257: variable 'omit' from source: magic vars 15494 1726853368.44390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853368.44442: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853368.44470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853368.44542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853368.44564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853368.44665: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 
1726853368.44679: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.44688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.44913: Set connection var ansible_connection to ssh 15494 1726853368.44929: Set connection var ansible_pipelining to False 15494 1726853368.44970: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853368.44985: Set connection var ansible_shell_type to sh 15494 1726853368.45179: Set connection var ansible_timeout to 10 15494 1726853368.45183: Set connection var ansible_shell_executable to /bin/sh 15494 1726853368.45185: variable 'ansible_shell_executable' from source: unknown 15494 1726853368.45187: variable 'ansible_connection' from source: unknown 15494 1726853368.45190: variable 'ansible_module_compression' from source: unknown 15494 1726853368.45192: variable 'ansible_shell_type' from source: unknown 15494 1726853368.45194: variable 'ansible_shell_executable' from source: unknown 15494 1726853368.45198: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.45200: variable 'ansible_pipelining' from source: unknown 15494 1726853368.45202: variable 'ansible_timeout' from source: unknown 15494 1726853368.45205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.45546: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853368.45573: variable 'omit' from source: magic vars 15494 1726853368.45621: starting attempt loop 15494 1726853368.45630: running the handler 15494 1726853368.45780: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853368.45784: _low_level_execute_command(): starting 15494 1726853368.45787: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853368.47867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853368.47979: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.48238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853368.48252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.48351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.48483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.50277: stdout chunk (state=3): >>>/root <<< 15494 1726853368.50481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.50485: stdout chunk 
(state=3): >>><<< 15494 1726853368.50491: stderr chunk (state=3): >>><<< 15494 1726853368.50569: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.50586: _low_level_execute_command(): starting 15494 1726853368.50596: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257 `" && echo ansible-tmp-1726853368.5053484-17154-28029648865257="` echo /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257 `" ) && sleep 0' 15494 1726853368.51524: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853368.51555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.51598: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.51652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.51821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853368.51867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.51870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.51935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.53845: stdout chunk (state=3): >>>ansible-tmp-1726853368.5053484-17154-28029648865257=/root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257 <<< 15494 1726853368.53991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.53994: stdout chunk (state=3): >>><<< 15494 1726853368.53996: stderr chunk (state=3): >>><<< 15494 1726853368.54176: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853368.5053484-17154-28029648865257=/root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.54180: variable 'ansible_module_compression' from source: unknown 15494 1726853368.54182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15494 1726853368.54184: variable 'ansible_facts' from source: unknown 15494 1726853368.54297: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py 15494 1726853368.54628: Sending initial data 15494 1726853368.54632: Sent initial data (155 bytes) 15494 1726853368.55156: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.55159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853368.55161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853368.55163: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.55166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.55223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.55230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.55262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.56819: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853368.56862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15494 1726853368.56913: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpi1h6d76t /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py <<< 15494 1726853368.56937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py" <<< 15494 1726853368.56968: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpi1h6d76t" to remote "/root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py" <<< 15494 1726853368.57731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.57789: stderr chunk (state=3): >>><<< 15494 1726853368.57792: stdout chunk (state=3): >>><<< 15494 1726853368.57807: done transferring module to remote 15494 1726853368.57815: _low_level_execute_command(): starting 15494 1726853368.57820: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/ /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py && sleep 0' 15494 1726853368.58252: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.58258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.58265: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.58269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.58329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.58332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.58380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.60188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.60191: stdout chunk (state=3): >>><<< 15494 1726853368.60193: stderr chunk (state=3): >>><<< 15494 1726853368.60209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.60217: _low_level_execute_command(): starting 15494 1726853368.60229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/AnsiballZ_command.py && sleep 0' 15494 1726853368.60661: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853368.60676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.60701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853368.60704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853368.60706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15494 1726853368.60758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.60763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.60806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.77451: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:29:28.757135", "end": "2024-09-20 13:29:28.772923", "delta": "0:00:00.015788", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15494 1726853368.79077: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853368.79081: stdout chunk (state=3): >>><<< 15494 1726853368.79083: stderr chunk (state=3): >>><<< 15494 1726853368.79195: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 13:29:28.757135", "end": "2024-09-20 13:29:28.772923", "delta": "0:00:00.015788", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.45.153 closed. 15494 1726853368.79230: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853368.79260: _low_level_execute_command(): starting 15494 1726853368.79264: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853368.5053484-17154-28029648865257/ > /dev/null 2>&1 && sleep 0' 15494 1726853368.80073: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853368.80122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853368.80126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853368.80129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.80131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853368.80134: stderr chunk (state=3): >>>debug2: match not found <<< 15494 1726853368.80143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.80157: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 15494 1726853368.80164: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853368.80232: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853368.80235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853368.80275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853368.80305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853368.80386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853368.80432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853368.82257: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853368.82316: stderr chunk (state=3): >>><<< 15494 1726853368.82319: stdout chunk (state=3): >>><<< 15494 1726853368.82324: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853368.82331: handler run complete 15494 1726853368.82355: Evaluated conditional (False): False 15494 1726853368.82364: attempt loop complete, returning result 15494 1726853368.82367: _execute() done 15494 1726853368.82369: dumping result to json 15494 1726853368.82376: done dumping result, returning 15494 1726853368.82383: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [02083763-bbaf-0028-1a50-00000000048d] 15494 1726853368.82387: sending task result for task 02083763-bbaf-0028-1a50-00000000048d 15494 1726853368.82482: done sending task result for task 02083763-bbaf-0028-1a50-00000000048d 15494 1726853368.82484: WORKER PROCESS EXITING fatal: [managed_node1]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.015788", "end": "2024-09-20 13:29:28.772923", "rc": 1, "start": "2024-09-20 13:29:28.757135" } MSG: non-zero return code ...ignoring 15494 1726853368.82563: no more pending results, returning what we have 15494 1726853368.82568: results queue empty 15494 1726853368.82569: checking for any_errors_fatal 15494 1726853368.82579: done checking for any_errors_fatal 15494 1726853368.82580: checking for max_fail_percentage 15494 1726853368.82582: done checking for max_fail_percentage 15494 1726853368.82583: checking to see if all hosts have failed and the running result is not ok 15494 1726853368.82583: done checking to see if all hosts have failed 15494 1726853368.82584: getting the remaining hosts for this loop 15494 1726853368.82586: done getting the remaining hosts for this loop 15494 1726853368.82589: getting the next task for host managed_node1 15494 1726853368.82597: done getting next task for host managed_node1 15494 1726853368.82600: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15494 1726853368.82603: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 15494 1726853368.82607: getting variables 15494 1726853368.82609: in VariableManager get_vars() 15494 1726853368.82638: Calling all_inventory to load vars for managed_node1 15494 1726853368.82640: Calling groups_inventory to load vars for managed_node1 15494 1726853368.82643: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853368.82655: Calling all_plugins_play to load vars for managed_node1 15494 1726853368.82658: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853368.82660: Calling groups_plugins_play to load vars for managed_node1 15494 1726853368.83946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853368.85541: done with get_vars() 15494 1726853368.85567: done getting variables 15494 1726853368.85644: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:29:28 -0400 (0:00:00.431) 0:00:37.472 ****** 15494 1726853368.85684: entering _queue_task() for managed_node1/set_fact 15494 1726853368.86035: worker is 1 (out of 1 available) 15494 1726853368.86048: exiting _queue_task() for managed_node1/set_fact 15494 1726853368.86063: done queuing things up, now waiting for results queue to drain 15494 1726853368.86065: waiting for pending results... 
15494 1726853368.86400: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15494 1726853368.86489: in run() - task 02083763-bbaf-0028-1a50-00000000048e 15494 1726853368.86678: variable 'ansible_search_path' from source: unknown 15494 1726853368.86682: variable 'ansible_search_path' from source: unknown 15494 1726853368.86684: calling self._execute() 15494 1726853368.86686: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.86688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.86691: variable 'omit' from source: magic vars 15494 1726853368.87112: variable 'ansible_distribution_major_version' from source: facts 15494 1726853368.87129: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853368.87264: variable 'nm_profile_exists' from source: set_fact 15494 1726853368.87291: Evaluated conditional (nm_profile_exists.rc == 0): False 15494 1726853368.87299: when evaluation is False, skipping this task 15494 1726853368.87306: _execute() done 15494 1726853368.87312: dumping result to json 15494 1726853368.87318: done dumping result, returning 15494 1726853368.87328: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-0028-1a50-00000000048e] 15494 1726853368.87336: sending task result for task 02083763-bbaf-0028-1a50-00000000048e skipping: [managed_node1] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 15494 1726853368.87475: no more pending results, returning what we have 15494 1726853368.87480: results queue empty 15494 1726853368.87481: checking for any_errors_fatal 15494 1726853368.87488: done checking for any_errors_fatal 15494 1726853368.87489: checking for max_fail_percentage 15494 1726853368.87491: done checking for 
max_fail_percentage 15494 1726853368.87492: checking to see if all hosts have failed and the running result is not ok 15494 1726853368.87493: done checking to see if all hosts have failed 15494 1726853368.87493: getting the remaining hosts for this loop 15494 1726853368.87495: done getting the remaining hosts for this loop 15494 1726853368.87498: getting the next task for host managed_node1 15494 1726853368.87509: done getting next task for host managed_node1 15494 1726853368.87511: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15494 1726853368.87516: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853368.87520: getting variables 15494 1726853368.87522: in VariableManager get_vars() 15494 1726853368.87552: Calling all_inventory to load vars for managed_node1 15494 1726853368.87555: Calling groups_inventory to load vars for managed_node1 15494 1726853368.87559: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853368.87574: Calling all_plugins_play to load vars for managed_node1 15494 1726853368.87578: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853368.87581: Calling groups_plugins_play to load vars for managed_node1 15494 1726853368.88578: done sending task result for task 02083763-bbaf-0028-1a50-00000000048e 15494 1726853368.88581: WORKER PROCESS EXITING 15494 1726853368.89831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853368.92096: done with get_vars() 15494 1726853368.92119: done getting variables 15494 1726853368.92178: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853368.92298: variable 'profile' from source: play vars 15494 1726853368.92302: variable 'interface' from source: set_fact 15494 1726853368.92358: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:29:28 -0400 (0:00:00.067) 0:00:37.539 ****** 15494 1726853368.92393: entering _queue_task() for managed_node1/command 15494 1726853368.92719: worker is 1 (out of 1 available) 15494 1726853368.92730: exiting _queue_task() for managed_node1/command 15494 
1726853368.92742: done queuing things up, now waiting for results queue to drain 15494 1726853368.92743: waiting for pending results... 15494 1726853368.93009: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15494 1726853368.93130: in run() - task 02083763-bbaf-0028-1a50-000000000490 15494 1726853368.93150: variable 'ansible_search_path' from source: unknown 15494 1726853368.93157: variable 'ansible_search_path' from source: unknown 15494 1726853368.93203: calling self._execute() 15494 1726853368.93303: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.93315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.93328: variable 'omit' from source: magic vars 15494 1726853368.93700: variable 'ansible_distribution_major_version' from source: facts 15494 1726853368.93715: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853368.93827: variable 'profile_stat' from source: set_fact 15494 1726853368.93851: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853368.93858: when evaluation is False, skipping this task 15494 1726853368.93866: _execute() done 15494 1726853368.93875: dumping result to json 15494 1726853368.93883: done dumping result, returning 15494 1726853368.93891: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000490] 15494 1726853368.93901: sending task result for task 02083763-bbaf-0028-1a50-000000000490 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853368.94039: no more pending results, returning what we have 15494 1726853368.94043: results queue empty 15494 1726853368.94044: checking for any_errors_fatal 15494 1726853368.94055: done checking for any_errors_fatal 15494 1726853368.94056: 
checking for max_fail_percentage 15494 1726853368.94058: done checking for max_fail_percentage 15494 1726853368.94059: checking to see if all hosts have failed and the running result is not ok 15494 1726853368.94060: done checking to see if all hosts have failed 15494 1726853368.94060: getting the remaining hosts for this loop 15494 1726853368.94062: done getting the remaining hosts for this loop 15494 1726853368.94066: getting the next task for host managed_node1 15494 1726853368.94076: done getting next task for host managed_node1 15494 1726853368.94079: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15494 1726853368.94083: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853368.94088: getting variables 15494 1726853368.94089: in VariableManager get_vars() 15494 1726853368.94118: Calling all_inventory to load vars for managed_node1 15494 1726853368.94121: Calling groups_inventory to load vars for managed_node1 15494 1726853368.94125: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853368.94141: Calling all_plugins_play to load vars for managed_node1 15494 1726853368.94144: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853368.94147: Calling groups_plugins_play to load vars for managed_node1 15494 1726853368.94690: done sending task result for task 02083763-bbaf-0028-1a50-000000000490 15494 1726853368.94694: WORKER PROCESS EXITING 15494 1726853368.95735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853368.97345: done with get_vars() 15494 1726853368.97365: done getting variables 15494 1726853368.97421: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853368.97526: variable 'profile' from source: play vars 15494 1726853368.97529: variable 'interface' from source: set_fact 15494 1726853368.97585: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:29:28 -0400 (0:00:00.052) 0:00:37.591 ****** 15494 1726853368.97615: entering _queue_task() for managed_node1/set_fact 15494 1726853368.97993: worker is 1 (out of 1 available) 15494 1726853368.98003: exiting _queue_task() for managed_node1/set_fact 15494 
1726853368.98014: done queuing things up, now waiting for results queue to drain 15494 1726853368.98015: waiting for pending results... 15494 1726853368.98180: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15494 1726853368.98302: in run() - task 02083763-bbaf-0028-1a50-000000000491 15494 1726853368.98323: variable 'ansible_search_path' from source: unknown 15494 1726853368.98331: variable 'ansible_search_path' from source: unknown 15494 1726853368.98376: calling self._execute() 15494 1726853368.98468: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853368.98480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853368.98494: variable 'omit' from source: magic vars 15494 1726853368.98860: variable 'ansible_distribution_major_version' from source: facts 15494 1726853368.98878: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853368.99005: variable 'profile_stat' from source: set_fact 15494 1726853368.99023: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853368.99031: when evaluation is False, skipping this task 15494 1726853368.99037: _execute() done 15494 1726853368.99044: dumping result to json 15494 1726853368.99050: done dumping result, returning 15494 1726853368.99060: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000491] 15494 1726853368.99069: sending task result for task 02083763-bbaf-0028-1a50-000000000491 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853368.99202: no more pending results, returning what we have 15494 1726853368.99207: results queue empty 15494 1726853368.99208: checking for any_errors_fatal 15494 1726853368.99215: done checking for any_errors_fatal 15494 
1726853368.99216: checking for max_fail_percentage 15494 1726853368.99218: done checking for max_fail_percentage 15494 1726853368.99219: checking to see if all hosts have failed and the running result is not ok 15494 1726853368.99219: done checking to see if all hosts have failed 15494 1726853368.99220: getting the remaining hosts for this loop 15494 1726853368.99222: done getting the remaining hosts for this loop 15494 1726853368.99225: getting the next task for host managed_node1 15494 1726853368.99233: done getting next task for host managed_node1 15494 1726853368.99236: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15494 1726853368.99240: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853368.99245: getting variables 15494 1726853368.99246: in VariableManager get_vars() 15494 1726853368.99277: Calling all_inventory to load vars for managed_node1 15494 1726853368.99279: Calling groups_inventory to load vars for managed_node1 15494 1726853368.99283: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853368.99296: Calling all_plugins_play to load vars for managed_node1 15494 1726853368.99299: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853368.99302: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.00084: done sending task result for task 02083763-bbaf-0028-1a50-000000000491 15494 1726853369.00087: WORKER PROCESS EXITING 15494 1726853369.00985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.02910: done with get_vars() 15494 1726853369.02932: done getting variables 15494 1726853369.03196: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853369.03302: variable 'profile' from source: play vars 15494 1726853369.03306: variable 'interface' from source: set_fact 15494 1726853369.03360: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:29:29 -0400 (0:00:00.059) 0:00:37.651 ****** 15494 1726853369.03597: entering _queue_task() for managed_node1/command 15494 1726853369.04105: worker is 1 (out of 1 available) 15494 1726853369.04120: exiting _queue_task() for managed_node1/command 15494 
1726853369.04135: done queuing things up, now waiting for results queue to drain 15494 1726853369.04137: waiting for pending results... 15494 1726853369.04988: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15494 1726853369.04994: in run() - task 02083763-bbaf-0028-1a50-000000000492 15494 1726853369.04997: variable 'ansible_search_path' from source: unknown 15494 1726853369.05090: variable 'ansible_search_path' from source: unknown 15494 1726853369.05133: calling self._execute() 15494 1726853369.05326: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.05338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.05352: variable 'omit' from source: magic vars 15494 1726853369.06008: variable 'ansible_distribution_major_version' from source: facts 15494 1726853369.06026: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853369.06154: variable 'profile_stat' from source: set_fact 15494 1726853369.06177: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853369.06184: when evaluation is False, skipping this task 15494 1726853369.06191: _execute() done 15494 1726853369.06197: dumping result to json 15494 1726853369.06204: done dumping result, returning 15494 1726853369.06213: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000492] 15494 1726853369.06222: sending task result for task 02083763-bbaf-0028-1a50-000000000492 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853369.06368: no more pending results, returning what we have 15494 1726853369.06374: results queue empty 15494 1726853369.06375: checking for any_errors_fatal 15494 1726853369.06384: done checking for any_errors_fatal 15494 1726853369.06385: checking 
for max_fail_percentage 15494 1726853369.06387: done checking for max_fail_percentage 15494 1726853369.06387: checking to see if all hosts have failed and the running result is not ok 15494 1726853369.06388: done checking to see if all hosts have failed 15494 1726853369.06389: getting the remaining hosts for this loop 15494 1726853369.06390: done getting the remaining hosts for this loop 15494 1726853369.06394: getting the next task for host managed_node1 15494 1726853369.06402: done getting next task for host managed_node1 15494 1726853369.06406: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15494 1726853369.06410: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853369.06415: getting variables 15494 1726853369.06416: in VariableManager get_vars() 15494 1726853369.06445: Calling all_inventory to load vars for managed_node1 15494 1726853369.06448: Calling groups_inventory to load vars for managed_node1 15494 1726853369.06451: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.06465: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.06468: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853369.06576: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.07283: done sending task result for task 02083763-bbaf-0028-1a50-000000000492 15494 1726853369.07287: WORKER PROCESS EXITING 15494 1726853369.08090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.09626: done with get_vars() 15494 1726853369.09648: done getting variables 15494 1726853369.09706: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853369.09810: variable 'profile' from source: play vars 15494 1726853369.09814: variable 'interface' from source: set_fact 15494 1726853369.09872: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:29:29 -0400 (0:00:00.063) 0:00:37.714 ****** 15494 1726853369.09903: entering _queue_task() for managed_node1/set_fact 15494 1726853369.10199: worker is 1 (out of 1 available) 15494 1726853369.10211: exiting _queue_task() for managed_node1/set_fact 15494 
1726853369.10225: done queuing things up, now waiting for results queue to drain 15494 1726853369.10226: waiting for pending results... 15494 1726853369.10487: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15494 1726853369.10631: in run() - task 02083763-bbaf-0028-1a50-000000000493 15494 1726853369.10649: variable 'ansible_search_path' from source: unknown 15494 1726853369.10656: variable 'ansible_search_path' from source: unknown 15494 1726853369.10696: calling self._execute() 15494 1726853369.10790: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.10801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.10816: variable 'omit' from source: magic vars 15494 1726853369.11175: variable 'ansible_distribution_major_version' from source: facts 15494 1726853369.11193: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853369.11320: variable 'profile_stat' from source: set_fact 15494 1726853369.11338: Evaluated conditional (profile_stat.stat.exists): False 15494 1726853369.11345: when evaluation is False, skipping this task 15494 1726853369.11353: _execute() done 15494 1726853369.11361: dumping result to json 15494 1726853369.11372: done dumping result, returning 15494 1726853369.11382: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [02083763-bbaf-0028-1a50-000000000493] 15494 1726853369.11391: sending task result for task 02083763-bbaf-0028-1a50-000000000493 15494 1726853369.11588: done sending task result for task 02083763-bbaf-0028-1a50-000000000493 15494 1726853369.11591: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15494 1726853369.11637: no more pending results, returning what we have 15494 1726853369.11642: results queue empty 
15494 1726853369.11642: checking for any_errors_fatal 15494 1726853369.11649: done checking for any_errors_fatal 15494 1726853369.11650: checking for max_fail_percentage 15494 1726853369.11652: done checking for max_fail_percentage 15494 1726853369.11653: checking to see if all hosts have failed and the running result is not ok 15494 1726853369.11654: done checking to see if all hosts have failed 15494 1726853369.11655: getting the remaining hosts for this loop 15494 1726853369.11656: done getting the remaining hosts for this loop 15494 1726853369.11660: getting the next task for host managed_node1 15494 1726853369.11669: done getting next task for host managed_node1 15494 1726853369.11674: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15494 1726853369.11677: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853369.11681: getting variables 15494 1726853369.11683: in VariableManager get_vars() 15494 1726853369.11712: Calling all_inventory to load vars for managed_node1 15494 1726853369.11715: Calling groups_inventory to load vars for managed_node1 15494 1726853369.11719: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.11732: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.11735: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853369.11738: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.13274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.16098: done with get_vars() 15494 1726853369.16126: done getting variables 15494 1726853369.16183: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853369.16306: variable 'profile' from source: play vars 15494 1726853369.16310: variable 'interface' from source: set_fact 15494 1726853369.16367: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 13:29:29 -0400 (0:00:00.064) 0:00:37.779 ****** 15494 1726853369.16402: entering _queue_task() for managed_node1/assert 15494 1726853369.16723: worker is 1 (out of 1 available) 15494 1726853369.16734: exiting _queue_task() for managed_node1/assert 15494 1726853369.16746: done queuing things up, now waiting for results queue to drain 15494 1726853369.16747: waiting for pending results... 
15494 1726853369.17190: running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' 15494 1726853369.17195: in run() - task 02083763-bbaf-0028-1a50-000000000480 15494 1726853369.17198: variable 'ansible_search_path' from source: unknown 15494 1726853369.17200: variable 'ansible_search_path' from source: unknown 15494 1726853369.17203: calling self._execute() 15494 1726853369.17291: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.17302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.17320: variable 'omit' from source: magic vars 15494 1726853369.17684: variable 'ansible_distribution_major_version' from source: facts 15494 1726853369.17700: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853369.17712: variable 'omit' from source: magic vars 15494 1726853369.17752: variable 'omit' from source: magic vars 15494 1726853369.17851: variable 'profile' from source: play vars 15494 1726853369.17866: variable 'interface' from source: set_fact 15494 1726853369.17932: variable 'interface' from source: set_fact 15494 1726853369.17955: variable 'omit' from source: magic vars 15494 1726853369.18006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853369.18046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853369.18074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853369.18099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853369.18115: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853369.18190: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 15494 1726853369.18193: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.18195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.18265: Set connection var ansible_connection to ssh 15494 1726853369.18277: Set connection var ansible_pipelining to False 15494 1726853369.18288: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853369.18298: Set connection var ansible_shell_type to sh 15494 1726853369.18308: Set connection var ansible_timeout to 10 15494 1726853369.18319: Set connection var ansible_shell_executable to /bin/sh 15494 1726853369.18345: variable 'ansible_shell_executable' from source: unknown 15494 1726853369.18406: variable 'ansible_connection' from source: unknown 15494 1726853369.18410: variable 'ansible_module_compression' from source: unknown 15494 1726853369.18412: variable 'ansible_shell_type' from source: unknown 15494 1726853369.18414: variable 'ansible_shell_executable' from source: unknown 15494 1726853369.18416: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.18418: variable 'ansible_pipelining' from source: unknown 15494 1726853369.18420: variable 'ansible_timeout' from source: unknown 15494 1726853369.18422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.18541: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853369.18560: variable 'omit' from source: magic vars 15494 1726853369.18574: starting attempt loop 15494 1726853369.18581: running the handler 15494 1726853369.18702: variable 'lsr_net_profile_exists' from source: set_fact 15494 1726853369.18713: Evaluated conditional (not 
lsr_net_profile_exists): True 15494 1726853369.18732: handler run complete 15494 1726853369.18745: attempt loop complete, returning result 15494 1726853369.18841: _execute() done 15494 1726853369.18844: dumping result to json 15494 1726853369.18847: done dumping result, returning 15494 1726853369.18849: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is absent - 'LSR-TST-br31' [02083763-bbaf-0028-1a50-000000000480] 15494 1726853369.18851: sending task result for task 02083763-bbaf-0028-1a50-000000000480 15494 1726853369.18914: done sending task result for task 02083763-bbaf-0028-1a50-000000000480 15494 1726853369.18917: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853369.18994: no more pending results, returning what we have 15494 1726853369.18997: results queue empty 15494 1726853369.18999: checking for any_errors_fatal 15494 1726853369.19009: done checking for any_errors_fatal 15494 1726853369.19010: checking for max_fail_percentage 15494 1726853369.19012: done checking for max_fail_percentage 15494 1726853369.19013: checking to see if all hosts have failed and the running result is not ok 15494 1726853369.19013: done checking to see if all hosts have failed 15494 1726853369.19014: getting the remaining hosts for this loop 15494 1726853369.19016: done getting the remaining hosts for this loop 15494 1726853369.19019: getting the next task for host managed_node1 15494 1726853369.19030: done getting next task for host managed_node1 15494 1726853369.19032: ^ task is: TASK: meta (flush_handlers) 15494 1726853369.19034: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853369.19038: getting variables 15494 1726853369.19040: in VariableManager get_vars() 15494 1726853369.19069: Calling all_inventory to load vars for managed_node1 15494 1726853369.19073: Calling groups_inventory to load vars for managed_node1 15494 1726853369.19077: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.19090: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.19093: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853369.19096: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.20617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.22207: done with get_vars() 15494 1726853369.22230: done getting variables 15494 1726853369.22296: in VariableManager get_vars() 15494 1726853369.22306: Calling all_inventory to load vars for managed_node1 15494 1726853369.22309: Calling groups_inventory to load vars for managed_node1 15494 1726853369.22311: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.22315: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.22318: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853369.22320: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.24323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.26402: done with get_vars() 15494 1726853369.26432: done queuing things up, now waiting for results queue to drain 15494 1726853369.26434: results queue empty 15494 1726853369.26435: checking for any_errors_fatal 15494 1726853369.26437: done checking for any_errors_fatal 15494 1726853369.26438: checking for max_fail_percentage 15494 1726853369.26439: done checking for max_fail_percentage 15494 1726853369.26440: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853369.26448: done checking to see if all hosts have failed 15494 1726853369.26449: getting the remaining hosts for this loop 15494 1726853369.26450: done getting the remaining hosts for this loop 15494 1726853369.26453: getting the next task for host managed_node1 15494 1726853369.26456: done getting next task for host managed_node1 15494 1726853369.26458: ^ task is: TASK: meta (flush_handlers) 15494 1726853369.26459: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853369.26462: getting variables 15494 1726853369.26463: in VariableManager get_vars() 15494 1726853369.26473: Calling all_inventory to load vars for managed_node1 15494 1726853369.26475: Calling groups_inventory to load vars for managed_node1 15494 1726853369.26478: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.26483: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.26485: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853369.26488: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.27628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.29229: done with get_vars() 15494 1726853369.29249: done getting variables 15494 1726853369.29300: in VariableManager get_vars() 15494 1726853369.29309: Calling all_inventory to load vars for managed_node1 15494 1726853369.29312: Calling groups_inventory to load vars for managed_node1 15494 1726853369.29314: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.29318: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.29320: Calling groups_plugins_inventory to load vars for 
managed_node1 15494 1726853369.29323: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.30453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.31990: done with get_vars() 15494 1726853369.32017: done queuing things up, now waiting for results queue to drain 15494 1726853369.32019: results queue empty 15494 1726853369.32020: checking for any_errors_fatal 15494 1726853369.32021: done checking for any_errors_fatal 15494 1726853369.32022: checking for max_fail_percentage 15494 1726853369.32023: done checking for max_fail_percentage 15494 1726853369.32023: checking to see if all hosts have failed and the running result is not ok 15494 1726853369.32024: done checking to see if all hosts have failed 15494 1726853369.32025: getting the remaining hosts for this loop 15494 1726853369.32026: done getting the remaining hosts for this loop 15494 1726853369.32028: getting the next task for host managed_node1 15494 1726853369.32031: done getting next task for host managed_node1 15494 1726853369.32032: ^ task is: None 15494 1726853369.32034: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853369.32035: done queuing things up, now waiting for results queue to drain 15494 1726853369.32036: results queue empty 15494 1726853369.32036: checking for any_errors_fatal 15494 1726853369.32037: done checking for any_errors_fatal 15494 1726853369.32038: checking for max_fail_percentage 15494 1726853369.32039: done checking for max_fail_percentage 15494 1726853369.32039: checking to see if all hosts have failed and the running result is not ok 15494 1726853369.32040: done checking to see if all hosts have failed 15494 1726853369.32041: getting the next task for host managed_node1 15494 1726853369.32043: done getting next task for host managed_node1 15494 1726853369.32044: ^ task is: None 15494 1726853369.32045: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853369.32092: in VariableManager get_vars() 15494 1726853369.32109: done with get_vars() 15494 1726853369.32115: in VariableManager get_vars() 15494 1726853369.32125: done with get_vars() 15494 1726853369.32130: variable 'omit' from source: magic vars 15494 1726853369.32243: variable 'task' from source: play vars 15494 1726853369.32477: in VariableManager get_vars() 15494 1726853369.32489: done with get_vars() 15494 1726853369.32508: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15494 1726853369.32931: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853369.33178: getting the remaining hosts for this loop 15494 1726853369.33179: done getting the remaining hosts for this loop 15494 1726853369.33182: getting the next task for host managed_node1 15494 1726853369.33184: done getting next task for host managed_node1 15494 1726853369.33186: ^ task is: TASK: Gathering Facts 15494 1726853369.33188: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853369.33189: getting variables 15494 1726853369.33190: in VariableManager get_vars() 15494 1726853369.33198: Calling all_inventory to load vars for managed_node1 15494 1726853369.33200: Calling groups_inventory to load vars for managed_node1 15494 1726853369.33202: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853369.33207: Calling all_plugins_play to load vars for managed_node1 15494 1726853369.33210: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853369.33213: Calling groups_plugins_play to load vars for managed_node1 15494 1726853369.35700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853369.38875: done with get_vars() 15494 1726853369.38896: done getting variables 15494 1726853369.38941: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 13:29:29 -0400 (0:00:00.225) 0:00:38.005 ****** 15494 1726853369.38966: entering _queue_task() for managed_node1/gather_facts 15494 1726853369.39596: worker is 1 (out of 1 available) 15494 1726853369.39610: exiting _queue_task() for managed_node1/gather_facts 15494 1726853369.39622: done queuing things up, now waiting for results queue to drain 15494 1726853369.39624: waiting for pending results... 
15494 1726853369.40388: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853369.40487: in run() - task 02083763-bbaf-0028-1a50-0000000004c5 15494 1726853369.40491: variable 'ansible_search_path' from source: unknown 15494 1726853369.40495: calling self._execute() 15494 1726853369.40668: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.40682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.40698: variable 'omit' from source: magic vars 15494 1726853369.41776: variable 'ansible_distribution_major_version' from source: facts 15494 1726853369.41976: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853369.41980: variable 'omit' from source: magic vars 15494 1726853369.41983: variable 'omit' from source: magic vars 15494 1726853369.41985: variable 'omit' from source: magic vars 15494 1726853369.41987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853369.41990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853369.41992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853369.42183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853369.42200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853369.42234: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853369.42243: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.42256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.42361: Set connection var ansible_connection to ssh 15494 1726853369.42776: Set 
connection var ansible_pipelining to False 15494 1726853369.42779: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853369.42781: Set connection var ansible_shell_type to sh 15494 1726853369.42783: Set connection var ansible_timeout to 10 15494 1726853369.42785: Set connection var ansible_shell_executable to /bin/sh 15494 1726853369.42787: variable 'ansible_shell_executable' from source: unknown 15494 1726853369.42788: variable 'ansible_connection' from source: unknown 15494 1726853369.42794: variable 'ansible_module_compression' from source: unknown 15494 1726853369.42797: variable 'ansible_shell_type' from source: unknown 15494 1726853369.42799: variable 'ansible_shell_executable' from source: unknown 15494 1726853369.42801: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853369.42803: variable 'ansible_pipelining' from source: unknown 15494 1726853369.42805: variable 'ansible_timeout' from source: unknown 15494 1726853369.42807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853369.43029: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853369.43046: variable 'omit' from source: magic vars 15494 1726853369.43058: starting attempt loop 15494 1726853369.43064: running the handler 15494 1726853369.43085: variable 'ansible_facts' from source: unknown 15494 1726853369.43109: _low_level_execute_command(): starting 15494 1726853369.43122: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853369.44532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 15494 1726853369.44551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853369.44566: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853369.44785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853369.44823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853369.46509: stdout chunk (state=3): >>>/root <<< 15494 1726853369.46604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853369.46721: stderr chunk (state=3): >>><<< 15494 1726853369.46736: stdout chunk (state=3): >>><<< 15494 1726853369.46773: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853369.46864: _low_level_execute_command(): starting 15494 1726853369.46878: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125 `" && echo ansible-tmp-1726853369.468464-17191-190978135030125="` echo /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125 `" ) && sleep 0' 15494 1726853369.48015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853369.48030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853369.48141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853369.48161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853369.48361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853369.48364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853369.48495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853369.48577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853369.50616: stdout chunk (state=3): >>>ansible-tmp-1726853369.468464-17191-190978135030125=/root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125 <<< 15494 1726853369.50792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853369.50825: stderr chunk (state=3): >>><<< 15494 1726853369.50854: stdout chunk (state=3): >>><<< 15494 1726853369.51065: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853369.468464-17191-190978135030125=/root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853369.51069: variable 'ansible_module_compression' from source: unknown 15494 1726853369.51130: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853369.51307: variable 'ansible_facts' from source: unknown 15494 1726853369.51921: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py 15494 1726853369.52492: Sending initial data 15494 1726853369.52880: Sent initial data (153 bytes) 15494 1726853369.53914: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853369.54290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853369.54341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853369.54355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853369.54679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853369.56290: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853369.56402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853369.56435: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpf8dn1e7d /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py <<< 15494 1726853369.56438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py" <<< 15494 1726853369.56519: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpf8dn1e7d" to remote "/root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py" <<< 15494 1726853369.59364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853369.59598: stdout chunk (state=3): >>><<< 15494 1726853369.59601: stderr chunk (state=3): >>><<< 15494 1726853369.59604: done transferring module to remote 15494 1726853369.59606: _low_level_execute_command(): starting 15494 1726853369.59608: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/ /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py && sleep 0' 15494 1726853369.60987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853369.61235: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853369.61301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853369.63150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853369.63161: stdout chunk (state=3): >>><<< 15494 1726853369.63173: stderr chunk (state=3): >>><<< 15494 1726853369.63195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853369.63212: _low_level_execute_command(): starting 15494 1726853369.63221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/AnsiballZ_setup.py && sleep 0' 15494 1726853369.64388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853369.64488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853369.64491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853369.64757: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.28887: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 
UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "29", "epoch": "1726853369", "epoch_int": "1726853369", "date": "2024-09-20", "time": "13:29:29", "iso8601_micro": "2024-09-20T17:29:29.921452Z", "iso8601": "2024-09-20T17:29:29Z", "iso8601_basic": "20240920T132929921452", "iso8601_basic_short": "20240920T132929", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dist<<< 15494 1726853370.28929: stdout chunk (state=3): >>>ribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": 
{"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 536, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794951168, "block_size": 4096, "block_total": 65519099, "block_available": 63914783, "block_used": 1604316, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_is_chroot": false, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.47509765625, "5m": 0.3486328125, "15m": 0.158203125}, "ansible_apparmor": 
{"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": 
"off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", 
"tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, 
"ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853370.30910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853370.30930: stdout chunk (state=3): >>><<< 15494 1726853370.30943: stderr chunk (state=3): >>><<< 15494 1726853370.30992: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": 
"10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "29", "epoch": "1726853369", "epoch_int": "1726853369", "date": "2024-09-20", "time": "13:29:29", "iso8601_micro": "2024-09-20T17:29:29.921452Z", "iso8601": "2024-09-20T17:29:29Z", "iso8601_basic": "20240920T132929921452", 
"iso8601_basic_short": "20240920T132929", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": 
"Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 536, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794951168, "block_size": 4096, "block_total": 65519099, "block_available": 63914783, "block_used": 1604316, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": 
"", "ansible_fips": false, "ansible_is_chroot": false, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.47509765625, "5m": 0.3486328125, "15m": 0.158203125}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": 
"eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853370.31405: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853370.31422: _low_level_execute_command(): starting 15494 1726853370.31500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853369.468464-17191-190978135030125/ > /dev/null 2>&1 && sleep 0' 15494 1726853370.32079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853370.32093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853370.32176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.32215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853370.32231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853370.32249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.32323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.34150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853370.34181: stdout chunk (state=3): >>><<< 15494 1726853370.34185: stderr chunk (state=3): >>><<< 15494 1726853370.34276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 
1726853370.34280: handler run complete 15494 1726853370.34349: variable 'ansible_facts' from source: unknown 15494 1726853370.34466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.34796: variable 'ansible_facts' from source: unknown 15494 1726853370.34891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.35030: attempt loop complete, returning result 15494 1726853370.35041: _execute() done 15494 1726853370.35056: dumping result to json 15494 1726853370.35155: done dumping result, returning 15494 1726853370.35158: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-0000000004c5] 15494 1726853370.35161: sending task result for task 02083763-bbaf-0028-1a50-0000000004c5 15494 1726853370.35691: done sending task result for task 02083763-bbaf-0028-1a50-0000000004c5 15494 1726853370.35695: WORKER PROCESS EXITING ok: [managed_node1] 15494 1726853370.36060: no more pending results, returning what we have 15494 1726853370.36063: results queue empty 15494 1726853370.36064: checking for any_errors_fatal 15494 1726853370.36065: done checking for any_errors_fatal 15494 1726853370.36066: checking for max_fail_percentage 15494 1726853370.36068: done checking for max_fail_percentage 15494 1726853370.36068: checking to see if all hosts have failed and the running result is not ok 15494 1726853370.36069: done checking to see if all hosts have failed 15494 1726853370.36070: getting the remaining hosts for this loop 15494 1726853370.36073: done getting the remaining hosts for this loop 15494 1726853370.36076: getting the next task for host managed_node1 15494 1726853370.36081: done getting next task for host managed_node1 15494 1726853370.36083: ^ task is: TASK: meta (flush_handlers) 15494 1726853370.36085: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853370.36088: getting variables 15494 1726853370.36091: in VariableManager get_vars() 15494 1726853370.36111: Calling all_inventory to load vars for managed_node1 15494 1726853370.36114: Calling groups_inventory to load vars for managed_node1 15494 1726853370.36117: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.36189: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.36193: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.36197: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.37524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.38537: done with get_vars() 15494 1726853370.38555: done getting variables 15494 1726853370.38607: in VariableManager get_vars() 15494 1726853370.38615: Calling all_inventory to load vars for managed_node1 15494 1726853370.38617: Calling groups_inventory to load vars for managed_node1 15494 1726853370.38619: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.38623: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.38625: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.38627: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.39378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.40828: done with get_vars() 15494 1726853370.40860: done queuing things up, now waiting for results queue to drain 15494 1726853370.40863: results queue empty 15494 1726853370.40864: checking for any_errors_fatal 15494 1726853370.40870: done checking for any_errors_fatal 15494 
1726853370.40872: checking for max_fail_percentage 15494 1726853370.40873: done checking for max_fail_percentage 15494 1726853370.40879: checking to see if all hosts have failed and the running result is not ok 15494 1726853370.40880: done checking to see if all hosts have failed 15494 1726853370.40880: getting the remaining hosts for this loop 15494 1726853370.40881: done getting the remaining hosts for this loop 15494 1726853370.40885: getting the next task for host managed_node1 15494 1726853370.40889: done getting next task for host managed_node1 15494 1726853370.40892: ^ task is: TASK: Include the task '{{ task }}' 15494 1726853370.40893: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853370.40895: getting variables 15494 1726853370.40896: in VariableManager get_vars() 15494 1726853370.40906: Calling all_inventory to load vars for managed_node1 15494 1726853370.40908: Calling groups_inventory to load vars for managed_node1 15494 1726853370.40910: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.40916: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.40917: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.40920: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.42063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.42927: done with get_vars() 15494 1726853370.42941: done getting variables 15494 1726853370.43065: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 13:29:30 -0400 (0:00:01.041) 0:00:39.046 ****** 15494 1726853370.43089: entering _queue_task() for managed_node1/include_tasks 15494 1726853370.43345: worker is 1 (out of 1 available) 15494 1726853370.43361: exiting _queue_task() for managed_node1/include_tasks 15494 1726853370.43375: done queuing things up, now waiting for results queue to drain 15494 1726853370.43376: waiting for pending results... 15494 1726853370.43546: running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' 15494 1726853370.43634: in run() - task 02083763-bbaf-0028-1a50-000000000077 15494 1726853370.43645: variable 'ansible_search_path' from source: unknown 15494 1726853370.43679: calling self._execute() 15494 1726853370.43756: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853370.43759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853370.43769: variable 'omit' from source: magic vars 15494 1726853370.44128: variable 'ansible_distribution_major_version' from source: facts 15494 1726853370.44377: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853370.44381: variable 'task' from source: play vars 15494 1726853370.44383: variable 'task' from source: play vars 15494 1726853370.44386: _execute() done 15494 1726853370.44388: dumping result to json 15494 1726853370.44390: done dumping result, returning 15494 1726853370.44392: done running TaskExecutor() for managed_node1/TASK: Include the task 'tasks/assert_device_absent.yml' [02083763-bbaf-0028-1a50-000000000077] 15494 1726853370.44394: sending task result for task 02083763-bbaf-0028-1a50-000000000077 15494 1726853370.44480: done sending task result for task 02083763-bbaf-0028-1a50-000000000077 15494 1726853370.44484: WORKER PROCESS EXITING 15494 1726853370.44512: no more 
pending results, returning what we have 15494 1726853370.44519: in VariableManager get_vars() 15494 1726853370.44556: Calling all_inventory to load vars for managed_node1 15494 1726853370.44559: Calling groups_inventory to load vars for managed_node1 15494 1726853370.44563: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.44581: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.44584: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.44587: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.45578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.46457: done with get_vars() 15494 1726853370.46470: variable 'ansible_search_path' from source: unknown 15494 1726853370.46482: we have included files to process 15494 1726853370.46483: generating all_blocks data 15494 1726853370.46484: done generating all_blocks data 15494 1726853370.46485: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15494 1726853370.46485: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15494 1726853370.46487: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15494 1726853370.46559: in VariableManager get_vars() 15494 1726853370.46573: done with get_vars() 15494 1726853370.46646: done processing included file 15494 1726853370.46649: iterating over new_blocks loaded from include file 15494 1726853370.46650: in VariableManager get_vars() 15494 1726853370.46658: done with get_vars() 15494 1726853370.46659: filtering new block on tags 15494 1726853370.46669: done filtering new block on tags 15494 1726853370.46672: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node1 15494 1726853370.46677: extending task lists for all hosts with included blocks 15494 1726853370.46696: done extending task lists 15494 1726853370.46697: done processing included files 15494 1726853370.46697: results queue empty 15494 1726853370.46697: checking for any_errors_fatal 15494 1726853370.46698: done checking for any_errors_fatal 15494 1726853370.46699: checking for max_fail_percentage 15494 1726853370.46699: done checking for max_fail_percentage 15494 1726853370.46700: checking to see if all hosts have failed and the running result is not ok 15494 1726853370.46700: done checking to see if all hosts have failed 15494 1726853370.46701: getting the remaining hosts for this loop 15494 1726853370.46702: done getting the remaining hosts for this loop 15494 1726853370.46703: getting the next task for host managed_node1 15494 1726853370.46706: done getting next task for host managed_node1 15494 1726853370.46707: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15494 1726853370.46709: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853370.46710: getting variables 15494 1726853370.46711: in VariableManager get_vars() 15494 1726853370.46716: Calling all_inventory to load vars for managed_node1 15494 1726853370.46718: Calling groups_inventory to load vars for managed_node1 15494 1726853370.46719: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.46723: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.46724: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.46726: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.51593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.52621: done with get_vars() 15494 1726853370.52646: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 13:29:30 -0400 (0:00:00.096) 0:00:39.142 ****** 15494 1726853370.52706: entering _queue_task() for managed_node1/include_tasks 15494 1726853370.53026: worker is 1 (out of 1 available) 15494 1726853370.53039: exiting _queue_task() for managed_node1/include_tasks 15494 1726853370.53054: done queuing things up, now waiting for results queue to drain 15494 1726853370.53055: waiting for pending results... 
15494 1726853370.53232: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 15494 1726853370.53327: in run() - task 02083763-bbaf-0028-1a50-0000000004d6 15494 1726853370.53336: variable 'ansible_search_path' from source: unknown 15494 1726853370.53340: variable 'ansible_search_path' from source: unknown 15494 1726853370.53369: calling self._execute() 15494 1726853370.53451: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853370.53455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853370.53464: variable 'omit' from source: magic vars 15494 1726853370.53754: variable 'ansible_distribution_major_version' from source: facts 15494 1726853370.53763: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853370.53767: _execute() done 15494 1726853370.53773: dumping result to json 15494 1726853370.53776: done dumping result, returning 15494 1726853370.53784: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-0028-1a50-0000000004d6] 15494 1726853370.53787: sending task result for task 02083763-bbaf-0028-1a50-0000000004d6 15494 1726853370.53878: done sending task result for task 02083763-bbaf-0028-1a50-0000000004d6 15494 1726853370.53881: WORKER PROCESS EXITING 15494 1726853370.53907: no more pending results, returning what we have 15494 1726853370.53912: in VariableManager get_vars() 15494 1726853370.53943: Calling all_inventory to load vars for managed_node1 15494 1726853370.53945: Calling groups_inventory to load vars for managed_node1 15494 1726853370.53951: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.53964: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.53966: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.53969: Calling groups_plugins_play to load vars for managed_node1 15494 
1726853370.54791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.55778: done with get_vars() 15494 1726853370.55803: variable 'ansible_search_path' from source: unknown 15494 1726853370.55805: variable 'ansible_search_path' from source: unknown 15494 1726853370.55813: variable 'task' from source: play vars 15494 1726853370.55939: variable 'task' from source: play vars 15494 1726853370.55969: we have included files to process 15494 1726853370.55970: generating all_blocks data 15494 1726853370.55973: done generating all_blocks data 15494 1726853370.55975: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853370.55976: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853370.55978: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15494 1726853370.56177: done processing included file 15494 1726853370.56179: iterating over new_blocks loaded from include file 15494 1726853370.56181: in VariableManager get_vars() 15494 1726853370.56198: done with get_vars() 15494 1726853370.56200: filtering new block on tags 15494 1726853370.56216: done filtering new block on tags 15494 1726853370.56218: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 15494 1726853370.56221: extending task lists for all hosts with included blocks 15494 1726853370.56331: done extending task lists 15494 1726853370.56332: done processing included files 15494 1726853370.56333: results queue empty 15494 1726853370.56333: checking for any_errors_fatal 15494 1726853370.56341: done checking 
for any_errors_fatal 15494 1726853370.56342: checking for max_fail_percentage 15494 1726853370.56343: done checking for max_fail_percentage 15494 1726853370.56343: checking to see if all hosts have failed and the running result is not ok 15494 1726853370.56344: done checking to see if all hosts have failed 15494 1726853370.56345: getting the remaining hosts for this loop 15494 1726853370.56346: done getting the remaining hosts for this loop 15494 1726853370.56348: getting the next task for host managed_node1 15494 1726853370.56353: done getting next task for host managed_node1 15494 1726853370.56355: ^ task is: TASK: Get stat for interface {{ interface }} 15494 1726853370.56358: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853370.56360: getting variables 15494 1726853370.56361: in VariableManager get_vars() 15494 1726853370.56369: Calling all_inventory to load vars for managed_node1 15494 1726853370.56373: Calling groups_inventory to load vars for managed_node1 15494 1726853370.56376: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.56386: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.56389: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.56393: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.58112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.60437: done with get_vars() 15494 1726853370.60465: done getting variables 15494 1726853370.60619: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:29:30 -0400 (0:00:00.079) 0:00:39.222 ****** 15494 1726853370.60654: entering _queue_task() for managed_node1/stat 15494 1726853370.61085: worker is 1 (out of 1 available) 15494 1726853370.61098: exiting _queue_task() for managed_node1/stat 15494 1726853370.61112: done queuing things up, now waiting for results queue to drain 15494 1726853370.61113: waiting for pending results... 
15494 1726853370.61492: running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 15494 1726853370.61514: in run() - task 02083763-bbaf-0028-1a50-0000000004e1 15494 1726853370.61537: variable 'ansible_search_path' from source: unknown 15494 1726853370.61589: variable 'ansible_search_path' from source: unknown 15494 1726853370.61593: calling self._execute() 15494 1726853370.61704: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853370.61879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853370.61884: variable 'omit' from source: magic vars 15494 1726853370.62114: variable 'ansible_distribution_major_version' from source: facts 15494 1726853370.62127: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853370.62143: variable 'omit' from source: magic vars 15494 1726853370.62194: variable 'omit' from source: magic vars 15494 1726853370.62331: variable 'interface' from source: set_fact 15494 1726853370.62354: variable 'omit' from source: magic vars 15494 1726853370.62404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853370.62439: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853370.62460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853370.62491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853370.62494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853370.62519: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853370.62523: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853370.62525: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853370.62653: Set connection var ansible_connection to ssh 15494 1726853370.62657: Set connection var ansible_pipelining to False 15494 1726853370.62660: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853370.62662: Set connection var ansible_shell_type to sh 15494 1726853370.62664: Set connection var ansible_timeout to 10 15494 1726853370.62667: Set connection var ansible_shell_executable to /bin/sh 15494 1726853370.62715: variable 'ansible_shell_executable' from source: unknown 15494 1726853370.62718: variable 'ansible_connection' from source: unknown 15494 1726853370.62721: variable 'ansible_module_compression' from source: unknown 15494 1726853370.62723: variable 'ansible_shell_type' from source: unknown 15494 1726853370.62725: variable 'ansible_shell_executable' from source: unknown 15494 1726853370.62727: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853370.62759: variable 'ansible_pipelining' from source: unknown 15494 1726853370.62762: variable 'ansible_timeout' from source: unknown 15494 1726853370.62764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853370.62903: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15494 1726853370.62907: variable 'omit' from source: magic vars 15494 1726853370.62910: starting attempt loop 15494 1726853370.62913: running the handler 15494 1726853370.62941: _low_level_execute_command(): starting 15494 1726853370.62951: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853370.63624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853370.63629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853370.63632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.63696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853370.63703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853370.63709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.63752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.65431: stdout chunk (state=3): >>>/root <<< 15494 1726853370.65534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853370.65585: stderr chunk (state=3): >>><<< 15494 1726853370.65589: stdout chunk (state=3): >>><<< 15494 1726853370.65626: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 
originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853370.65633: _low_level_execute_command(): starting 15494 1726853370.65644: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335 `" && echo ansible-tmp-1726853370.6562018-17238-117340724486335="` echo /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335 `" ) && sleep 0' 15494 1726853370.66275: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.66340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853370.66352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.66393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.68277: stdout chunk (state=3): >>>ansible-tmp-1726853370.6562018-17238-117340724486335=/root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335 <<< 15494 1726853370.68441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853370.68447: stdout chunk (state=3): >>><<< 15494 1726853370.68450: stderr chunk (state=3): >>><<< 15494 1726853370.68484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853370.6562018-17238-117340724486335=/root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853370.68530: variable 'ansible_module_compression' from source: unknown 15494 1726853370.68582: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15494 1726853370.68620: variable 'ansible_facts' from source: unknown 15494 1726853370.68699: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py 15494 1726853370.68823: Sending initial data 15494 1726853370.68826: Sent initial data (153 bytes) 15494 1726853370.69355: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853370.69358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853370.69361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.69387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.69446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853370.69464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.69512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.71038: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15494 1726853370.71043: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853370.71070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853370.71111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpdhglwqk_ /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py <<< 15494 1726853370.71114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py" <<< 15494 1726853370.71167: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpdhglwqk_" to remote "/root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py" <<< 15494 1726853370.71757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853370.71835: stderr chunk (state=3): >>><<< 15494 1726853370.71838: stdout chunk (state=3): >>><<< 15494 1726853370.71841: done transferring module to remote 15494 1726853370.71843: _low_level_execute_command(): starting 15494 1726853370.71850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/ /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py && sleep 0' 15494 1726853370.72354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853370.72358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853370.72360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 
1726853370.72362: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853370.72365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853370.72366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.72409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853370.72416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.72468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.74216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853370.74244: stderr chunk (state=3): >>><<< 15494 1726853370.74247: stdout chunk (state=3): >>><<< 15494 1726853370.74262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853370.74265: _low_level_execute_command(): starting 15494 1726853370.74268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/AnsiballZ_stat.py && sleep 0' 15494 1726853370.74721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853370.74724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853370.74726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.74728: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853370.74730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853370.74825: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.74875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.89784: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15494 1726853370.91061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853370.91065: stdout chunk (state=3): >>><<< 15494 1726853370.91067: stderr chunk (state=3): >>><<< 15494 1726853370.91088: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853370.91122: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853370.91166: _low_level_execute_command(): starting 15494 1726853370.91169: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853370.6562018-17238-117340724486335/ > /dev/null 2>&1 && sleep 0' 15494 1726853370.91748: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853370.91761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853370.91778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853370.91793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853370.91807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 
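The stat module's entire reply travels back over stdout as a single JSON object, visible verbatim in the chunk above. A small sketch of decoding that payload the way the controller does before building the task result (the payload string is copied from the log; the decoding code is illustrative, not Ansible's actual implementation):

```python
import json

# The AnsiballZ_stat.py reply, copied verbatim from the stdout chunk in the log.
raw = ('{"changed": false, "stat": {"exists": false}, "invocation": '
       '{"module_args": {"get_attributes": false, "get_checksum": false, '
       '"get_mime": false, "path": "/sys/class/net/LSR-TST-br31", '
       '"follow": false, "checksum_algorithm": "sha1"}}}')

result = json.loads(raw)
# These are the fields the later assert task keys off:
print(result["stat"]["exists"])                       # False
print(result["invocation"]["module_args"]["path"])    # /sys/class/net/LSR-TST-br31
```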
originally 10.31.45.153 <<< 15494 1726853370.91902: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853370.91934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853370.92004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853370.93833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853370.93892: stderr chunk (state=3): >>><<< 15494 1726853370.93904: stdout chunk (state=3): >>><<< 15494 1726853370.93923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853370.93935: handler run complete 15494 1726853370.93958: attempt loop complete, returning result 15494 1726853370.93964: _execute() done 15494 1726853370.94074: dumping result to json 15494 1726853370.94078: done dumping result, returning 15494 1726853370.94080: done running TaskExecutor() for managed_node1/TASK: Get stat for interface LSR-TST-br31 [02083763-bbaf-0028-1a50-0000000004e1] 15494 1726853370.94082: sending task result for task 02083763-bbaf-0028-1a50-0000000004e1 15494 1726853370.94146: done sending task result for task 02083763-bbaf-0028-1a50-0000000004e1 15494 1726853370.94148: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 15494 1726853370.94207: no more pending results, returning what we have 15494 1726853370.94211: results queue empty 15494 1726853370.94212: checking for any_errors_fatal 15494 1726853370.94214: done checking for any_errors_fatal 15494 1726853370.94214: checking for max_fail_percentage 15494 1726853370.94216: done checking for max_fail_percentage 15494 1726853370.94217: checking to see if all hosts have failed and the running result is not ok 15494 1726853370.94218: done checking to see if all hosts have failed 15494 1726853370.94218: getting the remaining hosts for this loop 15494 1726853370.94221: done getting the remaining hosts for this loop 15494 1726853370.94225: getting the next task for host managed_node1 15494 1726853370.94234: done getting next task for host managed_node1 15494 
1726853370.94237: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15494 1726853370.94240: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853370.94244: getting variables 15494 1726853370.94246: in VariableManager get_vars() 15494 1726853370.94275: Calling all_inventory to load vars for managed_node1 15494 1726853370.94278: Calling groups_inventory to load vars for managed_node1 15494 1726853370.94281: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853370.94291: Calling all_plugins_play to load vars for managed_node1 15494 1726853370.94294: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853370.94296: Calling groups_plugins_play to load vars for managed_node1 15494 1726853370.96013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853370.98027: done with get_vars() 15494 1726853370.98048: done getting variables 15494 1726853370.98108: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15494 1726853370.98231: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 13:29:30 -0400 (0:00:00.376) 0:00:39.598 ****** 15494 1726853370.98261: entering _queue_task() for managed_node1/assert 15494 1726853370.98698: worker is 1 (out of 1 available) 15494 1726853370.98708: exiting _queue_task() for managed_node1/assert 15494 1726853370.98719: done queuing things up, now waiting for results queue to drain 15494 1726853370.98720: waiting for pending results... 15494 1726853370.98959: running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15494 1726853370.99056: in run() - task 02083763-bbaf-0028-1a50-0000000004d7 15494 1726853370.99061: variable 'ansible_search_path' from source: unknown 15494 1726853370.99065: variable 'ansible_search_path' from source: unknown 15494 1726853370.99073: calling self._execute() 15494 1726853370.99184: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853370.99196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853370.99211: variable 'omit' from source: magic vars 15494 1726853370.99601: variable 'ansible_distribution_major_version' from source: facts 15494 1726853370.99620: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853370.99707: variable 'omit' from source: magic vars 15494 1726853370.99711: variable 'omit' from source: magic vars 15494 1726853370.99787: variable 'interface' from source: set_fact 15494 1726853370.99820: variable 'omit' from source: magic vars 15494 1726853370.99869: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853370.99915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853370.99951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
15494 1726853370.99978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853370.99997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853371.00041: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853371.00051: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853371.00060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853371.00252: Set connection var ansible_connection to ssh 15494 1726853371.00256: Set connection var ansible_pipelining to False 15494 1726853371.00258: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853371.00261: Set connection var ansible_shell_type to sh 15494 1726853371.00263: Set connection var ansible_timeout to 10 15494 1726853371.00265: Set connection var ansible_shell_executable to /bin/sh 15494 1726853371.00267: variable 'ansible_shell_executable' from source: unknown 15494 1726853371.00270: variable 'ansible_connection' from source: unknown 15494 1726853371.00274: variable 'ansible_module_compression' from source: unknown 15494 1726853371.00276: variable 'ansible_shell_type' from source: unknown 15494 1726853371.00278: variable 'ansible_shell_executable' from source: unknown 15494 1726853371.00280: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853371.00283: variable 'ansible_pipelining' from source: unknown 15494 1726853371.00285: variable 'ansible_timeout' from source: unknown 15494 1726853371.00294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853371.00441: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853371.00459: variable 'omit' from source: magic vars 15494 1726853371.00482: starting attempt loop 15494 1726853371.00489: running the handler 15494 1726853371.00636: variable 'interface_stat' from source: set_fact 15494 1726853371.00687: Evaluated conditional (not interface_stat.stat.exists): True 15494 1726853371.00690: handler run complete 15494 1726853371.00693: attempt loop complete, returning result 15494 1726853371.00695: _execute() done 15494 1726853371.00698: dumping result to json 15494 1726853371.00700: done dumping result, returning 15494 1726853371.00711: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is absent - 'LSR-TST-br31' [02083763-bbaf-0028-1a50-0000000004d7] 15494 1726853371.00722: sending task result for task 02083763-bbaf-0028-1a50-0000000004d7 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 15494 1726853371.00945: no more pending results, returning what we have 15494 1726853371.00949: results queue empty 15494 1726853371.00949: checking for any_errors_fatal 15494 1726853371.00959: done checking for any_errors_fatal 15494 1726853371.00960: checking for max_fail_percentage 15494 1726853371.00961: done checking for max_fail_percentage 15494 1726853371.00962: checking to see if all hosts have failed and the running result is not ok 15494 1726853371.00963: done checking to see if all hosts have failed 15494 1726853371.00964: getting the remaining hosts for this loop 15494 1726853371.00965: done getting the remaining hosts for this loop 15494 1726853371.00968: getting the next task for host managed_node1 15494 1726853371.00978: done getting next task for host managed_node1 15494 1726853371.00981: ^ task is: TASK: meta (flush_handlers) 15494 1726853371.00982: ^ state is: HOST STATE: block=3, task=1, rescue=0, 
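The `assert` action above evaluated the conditional `not interface_stat.stat.exists` against the registered stat result and reported "All assertions passed". A hypothetical re-creation of that logic (not Ansible's real `assert` action plugin, which evaluates Jinja2 expressions; this sketch uses plain callables):

```python
# Hypothetical sketch of what the 'assert' task result above boils down to:
# evaluate each conditional against registered facts and report success/failure.
def run_assert(conditionals, facts):
    for cond in conditionals:
        if not cond(facts):
            return {"changed": False, "failed": True, "msg": "Assertion failed"}
    return {"changed": False, "msg": "All assertions passed"}

# interface_stat mirrors the result registered by the earlier stat task.
facts = {"interface_stat": {"stat": {"exists": False}}}
result = run_assert(
    [lambda f: not f["interface_stat"]["stat"]["exists"]],  # the log's conditional
    facts,
)
print(result["msg"])  # All assertions passed
```

As in the log's task result, a passing assert reports `"changed": false`, since asserting never modifies the managed host.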
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853371.00986: getting variables 15494 1726853371.00988: in VariableManager get_vars() 15494 1726853371.01017: Calling all_inventory to load vars for managed_node1 15494 1726853371.01019: Calling groups_inventory to load vars for managed_node1 15494 1726853371.01023: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853371.01034: Calling all_plugins_play to load vars for managed_node1 15494 1726853371.01037: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853371.01040: Calling groups_plugins_play to load vars for managed_node1 15494 1726853371.01587: done sending task result for task 02083763-bbaf-0028-1a50-0000000004d7 15494 1726853371.01591: WORKER PROCESS EXITING 15494 1726853371.02638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853371.04296: done with get_vars() 15494 1726853371.04319: done getting variables 15494 1726853371.04387: in VariableManager get_vars() 15494 1726853371.04400: Calling all_inventory to load vars for managed_node1 15494 1726853371.04402: Calling groups_inventory to load vars for managed_node1 15494 1726853371.04404: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853371.04408: Calling all_plugins_play to load vars for managed_node1 15494 1726853371.04410: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853371.04412: Calling groups_plugins_play to load vars for managed_node1 15494 1726853371.05715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853371.07327: done with get_vars() 15494 1726853371.07353: done queuing things up, now waiting for results queue 
to drain 15494 1726853371.07355: results queue empty 15494 1726853371.07356: checking for any_errors_fatal 15494 1726853371.07358: done checking for any_errors_fatal 15494 1726853371.07359: checking for max_fail_percentage 15494 1726853371.07359: done checking for max_fail_percentage 15494 1726853371.07360: checking to see if all hosts have failed and the running result is not ok 15494 1726853371.07361: done checking to see if all hosts have failed 15494 1726853371.07367: getting the remaining hosts for this loop 15494 1726853371.07368: done getting the remaining hosts for this loop 15494 1726853371.07372: getting the next task for host managed_node1 15494 1726853371.07376: done getting next task for host managed_node1 15494 1726853371.07377: ^ task is: TASK: meta (flush_handlers) 15494 1726853371.07379: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853371.07381: getting variables 15494 1726853371.07382: in VariableManager get_vars() 15494 1726853371.07390: Calling all_inventory to load vars for managed_node1 15494 1726853371.07393: Calling groups_inventory to load vars for managed_node1 15494 1726853371.07395: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853371.07400: Calling all_plugins_play to load vars for managed_node1 15494 1726853371.07402: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853371.07405: Calling groups_plugins_play to load vars for managed_node1 15494 1726853371.08587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853371.10174: done with get_vars() 15494 1726853371.10197: done getting variables 15494 1726853371.10255: in VariableManager get_vars() 15494 1726853371.10266: Calling all_inventory to load vars for managed_node1 15494 1726853371.10268: Calling groups_inventory to load vars for managed_node1 15494 1726853371.10272: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853371.10277: Calling all_plugins_play to load vars for managed_node1 15494 1726853371.10280: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853371.10282: Calling groups_plugins_play to load vars for managed_node1 15494 1726853371.11523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853371.13120: done with get_vars() 15494 1726853371.13146: done queuing things up, now waiting for results queue to drain 15494 1726853371.13148: results queue empty 15494 1726853371.13149: checking for any_errors_fatal 15494 1726853371.13150: done checking for any_errors_fatal 15494 1726853371.13150: checking for max_fail_percentage 15494 1726853371.13151: done checking for max_fail_percentage 15494 1726853371.13152: checking to see if all hosts have failed and the running result is not 
ok 15494 1726853371.13153: done checking to see if all hosts have failed 15494 1726853371.13153: getting the remaining hosts for this loop 15494 1726853371.13154: done getting the remaining hosts for this loop 15494 1726853371.13157: getting the next task for host managed_node1 15494 1726853371.13165: done getting next task for host managed_node1 15494 1726853371.13166: ^ task is: None 15494 1726853371.13167: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853371.13169: done queuing things up, now waiting for results queue to drain 15494 1726853371.13170: results queue empty 15494 1726853371.13172: checking for any_errors_fatal 15494 1726853371.13173: done checking for any_errors_fatal 15494 1726853371.13173: checking for max_fail_percentage 15494 1726853371.13174: done checking for max_fail_percentage 15494 1726853371.13175: checking to see if all hosts have failed and the running result is not ok 15494 1726853371.13176: done checking to see if all hosts have failed 15494 1726853371.13177: getting the next task for host managed_node1 15494 1726853371.13179: done getting next task for host managed_node1 15494 1726853371.13180: ^ task is: None 15494 1726853371.13181: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853371.13222: in VariableManager get_vars() 15494 1726853371.13237: done with get_vars() 15494 1726853371.13242: in VariableManager get_vars() 15494 1726853371.13252: done with get_vars() 15494 1726853371.13256: variable 'omit' from source: magic vars 15494 1726853371.13294: in VariableManager get_vars() 15494 1726853371.13305: done with get_vars() 15494 1726853371.13325: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15494 1726853371.13558: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15494 1726853371.13585: getting the remaining hosts for this loop 15494 1726853371.13586: done getting the remaining hosts for this loop 15494 1726853371.13593: getting the next task for host managed_node1 15494 1726853371.13596: done getting next task for host managed_node1 15494 1726853371.13598: ^ task is: TASK: Gathering Facts 15494 1726853371.13599: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853371.13601: getting variables 15494 1726853371.13602: in VariableManager get_vars() 15494 1726853371.13610: Calling all_inventory to load vars for managed_node1 15494 1726853371.13612: Calling groups_inventory to load vars for managed_node1 15494 1726853371.13614: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853371.13619: Calling all_plugins_play to load vars for managed_node1 15494 1726853371.13622: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853371.13625: Calling groups_plugins_play to load vars for managed_node1 15494 1726853371.14802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853371.16417: done with get_vars() 15494 1726853371.16435: done getting variables 15494 1726853371.16483: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Friday 20 September 2024 13:29:31 -0400 (0:00:00.182) 0:00:39.780 ****** 15494 1726853371.16508: entering _queue_task() for managed_node1/gather_facts 15494 1726853371.16843: worker is 1 (out of 1 available) 15494 1726853371.16853: exiting _queue_task() for managed_node1/gather_facts 15494 1726853371.16865: done queuing things up, now waiting for results queue to drain 15494 1726853371.16867: waiting for pending results... 
15494 1726853371.17142: running TaskExecutor() for managed_node1/TASK: Gathering Facts 15494 1726853371.17277: in run() - task 02083763-bbaf-0028-1a50-0000000004fa 15494 1726853371.17281: variable 'ansible_search_path' from source: unknown 15494 1726853371.17406: calling self._execute() 15494 1726853371.17419: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853371.17430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853371.17443: variable 'omit' from source: magic vars 15494 1726853371.17841: variable 'ansible_distribution_major_version' from source: facts 15494 1726853371.17858: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853371.17869: variable 'omit' from source: magic vars 15494 1726853371.17902: variable 'omit' from source: magic vars 15494 1726853371.17951: variable 'omit' from source: magic vars 15494 1726853371.17998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853371.18038: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853371.18070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853371.18093: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853371.18109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853371.18142: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853371.18151: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853371.18166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853371.18277: Set connection var ansible_connection to ssh 15494 1726853371.18476: Set 
connection var ansible_pipelining to False 15494 1726853371.18479: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853371.18482: Set connection var ansible_shell_type to sh 15494 1726853371.18484: Set connection var ansible_timeout to 10 15494 1726853371.18486: Set connection var ansible_shell_executable to /bin/sh 15494 1726853371.18488: variable 'ansible_shell_executable' from source: unknown 15494 1726853371.18490: variable 'ansible_connection' from source: unknown 15494 1726853371.18492: variable 'ansible_module_compression' from source: unknown 15494 1726853371.18495: variable 'ansible_shell_type' from source: unknown 15494 1726853371.18497: variable 'ansible_shell_executable' from source: unknown 15494 1726853371.18499: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853371.18502: variable 'ansible_pipelining' from source: unknown 15494 1726853371.18504: variable 'ansible_timeout' from source: unknown 15494 1726853371.18506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853371.18566: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853371.18585: variable 'omit' from source: magic vars 15494 1726853371.18594: starting attempt loop 15494 1726853371.18601: running the handler 15494 1726853371.18630: variable 'ansible_facts' from source: unknown 15494 1726853371.18654: _low_level_execute_command(): starting 15494 1726853371.18666: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853371.19497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853371.19525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853371.19541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853371.19562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853371.19636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853371.21324: stdout chunk (state=3): >>>/root <<< 15494 1726853371.21470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853371.21476: stdout chunk (state=3): >>><<< 15494 1726853371.21479: stderr chunk (state=3): >>><<< 15494 1726853371.21496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853371.21508: _low_level_execute_command(): starting 15494 1726853371.21513: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608 `" && echo ansible-tmp-1726853371.2149715-17254-71138461946608="` echo /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608 `" ) && sleep 0' 15494 1726853371.21927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853371.21931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853371.21934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
15494 1726853371.21943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853371.21988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853371.21992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853371.22036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853371.23898: stdout chunk (state=3): >>>ansible-tmp-1726853371.2149715-17254-71138461946608=/root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608 <<< 15494 1726853371.24025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853371.24029: stdout chunk (state=3): >>><<< 15494 1726853371.24035: stderr chunk (state=3): >>><<< 15494 1726853371.24050: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853371.2149715-17254-71138461946608=/root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853371.24077: variable 'ansible_module_compression' from source: unknown 15494 1726853371.24115: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15494 1726853371.24160: variable 'ansible_facts' from source: unknown 15494 1726853371.24288: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py 15494 1726853371.24385: Sending initial data 15494 1726853371.24389: Sent initial data (153 bytes) 15494 1726853371.24804: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853371.24808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 1726853371.24810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15494 1726853371.24812: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 
1726853371.24814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853371.24864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853371.24873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853371.24907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853371.26472: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853371.26499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853371.26546: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpd8ms692k /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py <<< 15494 1726853371.26550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py" <<< 15494 1726853371.26605: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpd8ms692k" to remote "/root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py" <<< 15494 1726853371.28093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853371.28135: stderr chunk (state=3): >>><<< 15494 1726853371.28190: stdout chunk (state=3): >>><<< 15494 1726853371.28193: done transferring module to remote 15494 1726853371.28196: _low_level_execute_command(): starting 15494 1726853371.28200: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/ /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py && sleep 0' 15494 1726853371.28853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853371.28869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853371.28958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853371.28983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853371.29063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853371.30936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853371.30940: stdout chunk (state=3): >>><<< 15494 1726853371.30942: stderr chunk (state=3): >>><<< 15494 1726853371.31039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853371.31043: _low_level_execute_command(): starting 15494 1726853371.31045: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/AnsiballZ_setup.py && sleep 0' 15494 1726853371.31592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853371.31605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853371.31628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853371.31646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853371.31741: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853371.31761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853371.31777: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853371.31801: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853371.31875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853371.96940: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.47509765625, "5m": 0.3486328125, "15m": 0.158203125}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 
22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": 
"off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", 
"ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": 
["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "31", "epoch": "1726853371", "epoch_int": "1726853371", "date": "2024-09-20", "time": "13:29:31", "iso8601_micro": "2024-09-20T17:29:31.631878Z", "iso8601": "2024-09-20T17:29:31Z", "iso8601_basic": "20240920T132931631878", "iso8601_basic_short": "20240920T132931", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", 
"ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 537, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794951168, "block_size": 4096, "block_total": 65519099, "block_available": 63914783, "block_used": 1604316, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15494 1726853371.99091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853371.99095: stdout chunk (state=3): >>><<< 15494 1726853371.99097: stderr chunk (state=3): >>><<< 15494 1726853371.99179: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-45-153", "ansible_nodename": "ip-10-31-45-153.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec26b9e88796a7cb9ebdc2656ce384f6", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCXYnrsBaUY4i0/t1QUWoZkFPnFbGncbkmF01/zUZNuwldCwqYoDxpR2K8ARraEuK9oVLyYO0alCszdGP42db6R4xfRCOhN3faeZXsneupsJk4LLpIBkq0uIokeAtcL1tPOUQQzfsQqzZzp4BmJCVrwmUW5ADnzqCgvB3gsruyTQUrEUJ9MtB5zdaQm5MXuipjeZQThTjYCB2aXv/qTdzfKAwds3CoSZ6HA5GNdi6tahsy3CRIn6VtVkvwrqjJcwo+RaRQzjh+C9AUoH2YQmfLbvog62MsnLk/5OPq5HhxO81pm/TJDsI4LXwLh1VviMOWzVvIaXuKwdmYAdgX1NU561bBzeYzi55qBKo4TcMmnOXiV+Es7dDrKjwwpQKsv5tjSqVkcO6ek3I6SI38DXFOBLZtqXOOLsO12iOReYJUWe/+cgBrz12kDCPmaHFzFFZ3+N0GQ/WiYcgqiUItIhb3xJTbPqls0czPCpCzKo57GyTmv17fpfGhBfRoGX/H1zYs=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDOnt+7F+RlMaCRRXs8YyuhiHP1FmeDlj4rNB/K2mg1iP9loXXc/XjJ083xMBDu7m7uYLGB+dnmj299Y+RcAQpE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIKmxoIJtw8UORlc+o+Q7Pks5ERSyhMLnl+Oo8W221WGj", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.47509765625, "5m": 0.3486328125, "15m": 0.158203125}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_is_chroot": false, 
"ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 44238 10.31.45.153 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 44238 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::3a:e7ff:fe40:bc9f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", 
"prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", 
"tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.45.153", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:3a:e7:40:bc:9f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.45.153"], "ansible_all_ipv6_addresses": ["fe80::3a:e7ff:fe40:bc9f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.45.153", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::3a:e7ff:fe40:bc9f"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "29", "second": "31", "epoch": "1726853371", "epoch_int": "1726853371", "date": "2024-09-20", "time": "13:29:31", "iso8601_micro": "2024-09-20T17:29:31.631878Z", "iso8601": "2024-09-20T17:29:31Z", "iso8601_basic": "20240920T132931631878", "iso8601_basic_short": "20240920T132931", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2956, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 575, "free": 2956}, "nocache": {"free": 3293, "used": 238}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_uuid": "ec26b9e8-8796-a7cb-9ebd-c2656ce384f6", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 537, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794951168, "block_size": 4096, "block_total": 65519099, "block_available": 63914783, "block_used": 1604316, "inode_total": 131070960, "inode_available": 131029067, "inode_used": 41893, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853371.99972: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853372.00005: _low_level_execute_command(): starting 15494 1726853372.00062: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853371.2149715-17254-71138461946608/ > /dev/null 2>&1 && sleep 0' 15494 1726853372.01195: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853372.01215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853372.01254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.01257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found <<< 15494 
1726853372.01359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.01398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.01457: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.01598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.03414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.03480: stderr chunk (state=3): >>><<< 15494 1726853372.03483: stdout chunk (state=3): >>><<< 15494 1726853372.03489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.03497: handler run complete 15494 1726853372.03593: variable 'ansible_facts' from source: unknown 15494 1726853372.03673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.03881: variable 'ansible_facts' from source: unknown 15494 1726853372.03944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.04015: attempt loop complete, returning result 15494 1726853372.04018: _execute() done 15494 1726853372.04020: dumping result to json 15494 1726853372.04041: done dumping result, returning 15494 1726853372.04050: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [02083763-bbaf-0028-1a50-0000000004fa] 15494 1726853372.04053: sending task result for task 02083763-bbaf-0028-1a50-0000000004fa 15494 1726853372.04475: done sending task result for task 02083763-bbaf-0028-1a50-0000000004fa 15494 1726853372.04479: WORKER PROCESS EXITING ok: [managed_node1] 15494 1726853372.04696: no more pending results, returning what we have 15494 1726853372.04698: results queue empty 15494 1726853372.04699: checking for any_errors_fatal 15494 1726853372.04700: done checking for any_errors_fatal 15494 1726853372.04701: checking for max_fail_percentage 15494 1726853372.04702: done checking for max_fail_percentage 15494 1726853372.04703: checking to see if all hosts have failed and the running result is not ok 15494 1726853372.04703: done checking 
to see if all hosts have failed 15494 1726853372.04704: getting the remaining hosts for this loop 15494 1726853372.04705: done getting the remaining hosts for this loop 15494 1726853372.04707: getting the next task for host managed_node1 15494 1726853372.04711: done getting next task for host managed_node1 15494 1726853372.04712: ^ task is: TASK: meta (flush_handlers) 15494 1726853372.04713: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853372.04715: getting variables 15494 1726853372.04716: in VariableManager get_vars() 15494 1726853372.04732: Calling all_inventory to load vars for managed_node1 15494 1726853372.04734: Calling groups_inventory to load vars for managed_node1 15494 1726853372.04736: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853372.04743: Calling all_plugins_play to load vars for managed_node1 15494 1726853372.04745: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853372.04747: Calling groups_plugins_play to load vars for managed_node1 15494 1726853372.06195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.07383: done with get_vars() 15494 1726853372.07398: done getting variables 15494 1726853372.07448: in VariableManager get_vars() 15494 1726853372.07456: Calling all_inventory to load vars for managed_node1 15494 1726853372.07458: Calling groups_inventory to load vars for managed_node1 15494 1726853372.07461: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853372.07466: Calling all_plugins_play to load vars for managed_node1 15494 1726853372.07468: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853372.07473: 
Calling groups_plugins_play to load vars for managed_node1 15494 1726853372.08405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.10072: done with get_vars() 15494 1726853372.10104: done queuing things up, now waiting for results queue to drain 15494 1726853372.10107: results queue empty 15494 1726853372.10108: checking for any_errors_fatal 15494 1726853372.10111: done checking for any_errors_fatal 15494 1726853372.10112: checking for max_fail_percentage 15494 1726853372.10117: done checking for max_fail_percentage 15494 1726853372.10118: checking to see if all hosts have failed and the running result is not ok 15494 1726853372.10119: done checking to see if all hosts have failed 15494 1726853372.10120: getting the remaining hosts for this loop 15494 1726853372.10121: done getting the remaining hosts for this loop 15494 1726853372.10124: getting the next task for host managed_node1 15494 1726853372.10129: done getting next task for host managed_node1 15494 1726853372.10132: ^ task is: TASK: Verify network state restored to default 15494 1726853372.10133: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853372.10136: getting variables 15494 1726853372.10137: in VariableManager get_vars() 15494 1726853372.10145: Calling all_inventory to load vars for managed_node1 15494 1726853372.10147: Calling groups_inventory to load vars for managed_node1 15494 1726853372.10149: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853372.10154: Calling all_plugins_play to load vars for managed_node1 15494 1726853372.10156: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853372.10159: Calling groups_plugins_play to load vars for managed_node1 15494 1726853372.11340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.12935: done with get_vars() 15494 1726853372.12958: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Friday 20 September 2024 13:29:32 -0400 (0:00:00.965) 0:00:40.746 ****** 15494 1726853372.13038: entering _queue_task() for managed_node1/include_tasks 15494 1726853372.13415: worker is 1 (out of 1 available) 15494 1726853372.13428: exiting _queue_task() for managed_node1/include_tasks 15494 1726853372.13441: done queuing things up, now waiting for results queue to drain 15494 1726853372.13443: waiting for pending results... 
15494 1726853372.13757: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 15494 1726853372.13803: in run() - task 02083763-bbaf-0028-1a50-00000000007a 15494 1726853372.13823: variable 'ansible_search_path' from source: unknown 15494 1726853372.13870: calling self._execute() 15494 1726853372.13975: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.13987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.14002: variable 'omit' from source: magic vars 15494 1726853372.14377: variable 'ansible_distribution_major_version' from source: facts 15494 1726853372.14476: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853372.14480: _execute() done 15494 1726853372.14483: dumping result to json 15494 1726853372.14485: done dumping result, returning 15494 1726853372.14487: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [02083763-bbaf-0028-1a50-00000000007a] 15494 1726853372.14490: sending task result for task 02083763-bbaf-0028-1a50-00000000007a 15494 1726853372.14563: done sending task result for task 02083763-bbaf-0028-1a50-00000000007a 15494 1726853372.14566: WORKER PROCESS EXITING 15494 1726853372.14623: no more pending results, returning what we have 15494 1726853372.14630: in VariableManager get_vars() 15494 1726853372.14664: Calling all_inventory to load vars for managed_node1 15494 1726853372.14666: Calling groups_inventory to load vars for managed_node1 15494 1726853372.14670: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853372.14685: Calling all_plugins_play to load vars for managed_node1 15494 1726853372.14688: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853372.14690: Calling groups_plugins_play to load vars for managed_node1 15494 1726853372.16112: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.17728: done with get_vars() 15494 1726853372.17746: variable 'ansible_search_path' from source: unknown 15494 1726853372.17765: we have included files to process 15494 1726853372.17767: generating all_blocks data 15494 1726853372.17768: done generating all_blocks data 15494 1726853372.17769: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15494 1726853372.17770: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15494 1726853372.17774: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15494 1726853372.18161: done processing included file 15494 1726853372.18163: iterating over new_blocks loaded from include file 15494 1726853372.18164: in VariableManager get_vars() 15494 1726853372.18177: done with get_vars() 15494 1726853372.18179: filtering new block on tags 15494 1726853372.18200: done filtering new block on tags 15494 1726853372.18203: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 15494 1726853372.18208: extending task lists for all hosts with included blocks 15494 1726853372.18240: done extending task lists 15494 1726853372.18241: done processing included files 15494 1726853372.18242: results queue empty 15494 1726853372.18242: checking for any_errors_fatal 15494 1726853372.18244: done checking for any_errors_fatal 15494 1726853372.18244: checking for max_fail_percentage 15494 1726853372.18245: done checking for max_fail_percentage 15494 1726853372.18246: checking to see if all hosts have failed and the running 
result is not ok 15494 1726853372.18247: done checking to see if all hosts have failed 15494 1726853372.18247: getting the remaining hosts for this loop 15494 1726853372.18249: done getting the remaining hosts for this loop 15494 1726853372.18251: getting the next task for host managed_node1 15494 1726853372.18255: done getting next task for host managed_node1 15494 1726853372.18257: ^ task is: TASK: Check routes and DNS 15494 1726853372.18259: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853372.18261: getting variables 15494 1726853372.18262: in VariableManager get_vars() 15494 1726853372.18270: Calling all_inventory to load vars for managed_node1 15494 1726853372.18273: Calling groups_inventory to load vars for managed_node1 15494 1726853372.18276: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853372.18281: Calling all_plugins_play to load vars for managed_node1 15494 1726853372.18283: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853372.18286: Calling groups_plugins_play to load vars for managed_node1 15494 1726853372.19458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.21099: done with get_vars() 15494 1726853372.21118: done getting variables 15494 1726853372.21157: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:29:32 -0400 (0:00:00.081) 0:00:40.827 ****** 15494 1726853372.21194: entering _queue_task() for managed_node1/shell 15494 1726853372.21647: worker is 1 (out of 1 available) 15494 1726853372.21658: exiting _queue_task() for managed_node1/shell 15494 1726853372.21669: done queuing things up, now waiting for results queue to drain 15494 1726853372.21673: waiting for pending results... 
15494 1726853372.21911: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 15494 1726853372.21977: in run() - task 02083763-bbaf-0028-1a50-00000000050b 15494 1726853372.21997: variable 'ansible_search_path' from source: unknown 15494 1726853372.22061: variable 'ansible_search_path' from source: unknown 15494 1726853372.22065: calling self._execute() 15494 1726853372.22152: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.22169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.22187: variable 'omit' from source: magic vars 15494 1726853372.22529: variable 'ansible_distribution_major_version' from source: facts 15494 1726853372.22539: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853372.22545: variable 'omit' from source: magic vars 15494 1726853372.22585: variable 'omit' from source: magic vars 15494 1726853372.22611: variable 'omit' from source: magic vars 15494 1726853372.22643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853372.22673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853372.22690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853372.22703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853372.22715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853372.22737: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853372.22740: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.22744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.22814: 
Set connection var ansible_connection to ssh 15494 1726853372.22819: Set connection var ansible_pipelining to False 15494 1726853372.22825: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853372.22828: Set connection var ansible_shell_type to sh 15494 1726853372.22831: Set connection var ansible_timeout to 10 15494 1726853372.22840: Set connection var ansible_shell_executable to /bin/sh 15494 1726853372.22856: variable 'ansible_shell_executable' from source: unknown 15494 1726853372.22859: variable 'ansible_connection' from source: unknown 15494 1726853372.22862: variable 'ansible_module_compression' from source: unknown 15494 1726853372.22865: variable 'ansible_shell_type' from source: unknown 15494 1726853372.22867: variable 'ansible_shell_executable' from source: unknown 15494 1726853372.22869: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.22876: variable 'ansible_pipelining' from source: unknown 15494 1726853372.22878: variable 'ansible_timeout' from source: unknown 15494 1726853372.22881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.22981: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853372.22990: variable 'omit' from source: magic vars 15494 1726853372.22995: starting attempt loop 15494 1726853372.22998: running the handler 15494 1726853372.23007: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853372.23021: 
_low_level_execute_command(): starting 15494 1726853372.23029: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853372.23517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.23521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration <<< 15494 1726853372.23526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.23581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.23585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.23633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.25412: stdout chunk (state=3): >>>/root <<< 15494 1726853372.25430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.25453: stderr chunk (state=3): >>><<< 15494 1726853372.25477: stdout chunk (state=3): >>><<< 15494 1726853372.25502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.25547: _low_level_execute_command(): starting 15494 1726853372.25551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957 `" && echo ansible-tmp-1726853372.255108-17293-170842436599957="` echo /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957 `" ) && sleep 0' 15494 1726853372.26079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853372.26082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.26094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.26106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.26150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.26154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.26195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.28108: stdout chunk (state=3): >>>ansible-tmp-1726853372.255108-17293-170842436599957=/root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957 <<< 15494 1726853372.28264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.28267: stdout chunk (state=3): >>><<< 15494 1726853372.28269: stderr chunk (state=3): >>><<< 15494 1726853372.28500: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853372.255108-17293-170842436599957=/root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.28503: variable 'ansible_module_compression' from source: unknown 15494 1726853372.28505: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15494 1726853372.28507: variable 'ansible_facts' from source: unknown 15494 1726853372.28556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py 15494 1726853372.28712: Sending initial data 15494 1726853372.28716: Sent initial data (155 bytes) 15494 1726853372.29388: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.29393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.29416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.30950: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853372.30992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853372.31123: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpd3edn8fh /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py <<< 15494 1726853372.31128: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py" <<< 15494 1726853372.31137: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpd3edn8fh" to remote "/root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py" <<< 15494 1726853372.32241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.32277: stderr chunk (state=3): >>><<< 15494 1726853372.32291: stdout chunk (state=3): >>><<< 15494 1726853372.32335: done transferring module to remote 15494 1726853372.32338: _low_level_execute_command(): starting 15494 1726853372.32340: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/ /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py && sleep 0' 15494 1726853372.33090: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.33126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.33134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.33143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.33232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.34989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.35033: stderr chunk (state=3): >>><<< 15494 1726853372.35036: stdout chunk (state=3): >>><<< 15494 1726853372.35051: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.35131: _low_level_execute_command(): starting 15494 1726853372.35135: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/AnsiballZ_command.py && sleep 0' 15494 1726853372.35669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853372.35688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853372.35707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.35724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853372.35791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.35841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.35859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.35882: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.35957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.51807: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3076sec preferred_lft 3076sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:29:32.508414", "end": "2024-09-20 13:29:32.517073", "delta": "0:00:00.008659", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15494 1726853372.53546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. <<< 15494 1726853372.53550: stdout chunk (state=3): >>><<< 15494 1726853372.53553: stderr chunk (state=3): >>><<< 15494 1726853372.53575: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3076sec preferred_lft 3076sec\n inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:29:32.508414", "end": "2024-09-20 13:29:32.517073", "delta": "0:00:00.008659", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP 
ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
15494 1726853372.53717: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853372.53733: _low_level_execute_command(): starting 15494 1726853372.53808: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853372.255108-17293-170842436599957/ > /dev/null 2>&1 && sleep 0' 15494 1726853372.54325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853372.54338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853372.54351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.54379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853372.54398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 1726853372.54491: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.54688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.54890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.57177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.57181: stdout chunk (state=3): >>><<< 15494 1726853372.57184: stderr chunk (state=3): >>><<< 15494 1726853372.57186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.57189: handler run complete 15494 1726853372.57191: Evaluated conditional (False): False 15494 1726853372.57192: attempt loop complete, returning result 15494 1726853372.57194: _execute() done 15494 1726853372.57196: dumping result to json 15494 1726853372.57198: done dumping result, returning 15494 1726853372.57200: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [02083763-bbaf-0028-1a50-00000000050b] 15494 1726853372.57202: sending task result for task 02083763-bbaf-0028-1a50-00000000050b 15494 1726853372.57285: done sending task result for task 02083763-bbaf-0028-1a50-00000000050b 15494 1726853372.57288: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008659", "end": "2024-09-20 13:29:32.517073", "rc": 0, "start": "2024-09-20 13:29:32.508414" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:3a:e7:40:bc:9f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.45.153/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3076sec preferred_lft 3076sec inet6 fe80::3a:e7ff:fe40:bc9f/64 scope link 
noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.45.153 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.45.153 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15494 1726853372.57365: no more pending results, returning what we have 15494 1726853372.57369: results queue empty 15494 1726853372.57372: checking for any_errors_fatal 15494 1726853372.57375: done checking for any_errors_fatal 15494 1726853372.57375: checking for max_fail_percentage 15494 1726853372.57377: done checking for max_fail_percentage 15494 1726853372.57379: checking to see if all hosts have failed and the running result is not ok 15494 1726853372.57380: done checking to see if all hosts have failed 15494 1726853372.57381: getting the remaining hosts for this loop 15494 1726853372.57382: done getting the remaining hosts for this loop 15494 1726853372.57386: getting the next task for host managed_node1 15494 1726853372.57394: done getting next task for host managed_node1 15494 1726853372.57396: ^ task is: TASK: Verify DNS and network connectivity 15494 1726853372.57399: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853372.57404: getting variables 15494 1726853372.57405: in VariableManager get_vars() 15494 1726853372.57439: Calling all_inventory to load vars for managed_node1 15494 1726853372.57446: Calling groups_inventory to load vars for managed_node1 15494 1726853372.57450: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853372.57463: Calling all_plugins_play to load vars for managed_node1 15494 1726853372.57467: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853372.57470: Calling groups_plugins_play to load vars for managed_node1 15494 1726853372.60505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853372.63707: done with get_vars() 15494 1726853372.63734: done getting variables 15494 1726853372.63997: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:29:32 -0400 (0:00:00.428) 0:00:41.256 ****** 15494 1726853372.64026: entering _queue_task() for managed_node1/shell 15494 1726853372.64629: worker is 1 (out of 1 available) 15494 1726853372.64642: exiting _queue_task() for managed_node1/shell 15494 1726853372.64656: done queuing things up, now waiting for results queue to drain 15494 1726853372.64657: waiting for pending results... 
15494 1726853372.65386: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 15494 1726853372.65390: in run() - task 02083763-bbaf-0028-1a50-00000000050c 15494 1726853372.65393: variable 'ansible_search_path' from source: unknown 15494 1726853372.65396: variable 'ansible_search_path' from source: unknown 15494 1726853372.65398: calling self._execute() 15494 1726853372.65665: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.65679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.65693: variable 'omit' from source: magic vars 15494 1726853372.66461: variable 'ansible_distribution_major_version' from source: facts 15494 1726853372.66479: Evaluated conditional (ansible_distribution_major_version != '6'): True 15494 1726853372.66976: variable 'ansible_facts' from source: unknown 15494 1726853372.68638: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15494 1726853372.69177: variable 'omit' from source: magic vars 15494 1726853372.69180: variable 'omit' from source: magic vars 15494 1726853372.69182: variable 'omit' from source: magic vars 15494 1726853372.69203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15494 1726853372.69241: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15494 1726853372.69290: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15494 1726853372.69395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853372.69410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15494 1726853372.69608: variable 'inventory_hostname' from source: host vars for 'managed_node1' 15494 1726853372.69616: variable 
'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.69623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.69776: Set connection var ansible_connection to ssh 15494 1726853372.69942: Set connection var ansible_pipelining to False 15494 1726853372.69986: Set connection var ansible_module_compression to ZIP_DEFLATED 15494 1726853372.69993: Set connection var ansible_shell_type to sh 15494 1726853372.70003: Set connection var ansible_timeout to 10 15494 1726853372.70013: Set connection var ansible_shell_executable to /bin/sh 15494 1726853372.70044: variable 'ansible_shell_executable' from source: unknown 15494 1726853372.70376: variable 'ansible_connection' from source: unknown 15494 1726853372.70380: variable 'ansible_module_compression' from source: unknown 15494 1726853372.70382: variable 'ansible_shell_type' from source: unknown 15494 1726853372.70384: variable 'ansible_shell_executable' from source: unknown 15494 1726853372.70386: variable 'ansible_host' from source: host vars for 'managed_node1' 15494 1726853372.70388: variable 'ansible_pipelining' from source: unknown 15494 1726853372.70390: variable 'ansible_timeout' from source: unknown 15494 1726853372.70392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 15494 1726853372.70777: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853372.70781: variable 'omit' from source: magic vars 15494 1726853372.70783: starting attempt loop 15494 1726853372.70786: running the handler 15494 1726853372.70788: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15494 1726853372.70791: _low_level_execute_command(): starting 15494 1726853372.70856: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15494 1726853372.72465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853372.72786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.72829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.72873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.74812: stdout chunk (state=3): >>>/root <<< 15494 1726853372.74863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.74876: stdout chunk (state=3): >>><<< 15494 1726853372.74889: stderr chunk (state=3): >>><<< 15494 1726853372.74913: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.75166: _low_level_execute_command(): starting 15494 1726853372.75170: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092 `" && echo ansible-tmp-1726853372.750782-17310-167380348251092="` echo /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092 `" ) && sleep 0' 15494 1726853372.76336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.76377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.76486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.76489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.76538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.78423: stdout chunk (state=3): >>>ansible-tmp-1726853372.750782-17310-167380348251092=/root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092 <<< 15494 1726853372.78518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.78558: stderr chunk (state=3): >>><<< 15494 1726853372.78779: stdout chunk (state=3): >>><<< 15494 1726853372.78784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853372.750782-17310-167380348251092=/root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.78787: variable 'ansible_module_compression' from source: unknown 15494 1726853372.78976: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-154949jgbhg1x/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15494 1726853372.78979: variable 'ansible_facts' from source: unknown 15494 1726853372.79127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py 15494 1726853372.79456: Sending initial data 15494 1726853372.79459: Sent initial data (155 bytes) 15494 1726853372.81086: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.81395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.81462: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.83001: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15494 1726853372.83028: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15494 1726853372.83066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15494 1726853372.83115: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpl3_j1kts /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py <<< 15494 1726853372.83124: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py" <<< 15494 1726853372.83160: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-154949jgbhg1x/tmpl3_j1kts" to remote "/root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py" <<< 15494 1726853372.83827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.83899: stderr chunk (state=3): >>><<< 15494 1726853372.83911: stdout chunk (state=3): >>><<< 15494 1726853372.83978: done transferring module to remote 15494 1726853372.83992: _low_level_execute_command(): starting 15494 1726853372.84001: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/ /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py && sleep 0' 15494 1726853372.84614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853372.84632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853372.84646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.84665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853372.84685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 <<< 15494 
1726853372.84738: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.84799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.84821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.84897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853372.86674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853372.86709: stdout chunk (state=3): >>><<< 15494 1726853372.86712: stderr chunk (state=3): >>><<< 15494 1726853372.86729: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853372.86815: _low_level_execute_command(): starting 15494 1726853372.86819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/AnsiballZ_command.py && sleep 0' 15494 1726853372.87358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853372.87368: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853372.87384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15494 1726853372.87399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15494 1726853372.87428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.87439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853372.87489: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853372.87544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' <<< 15494 1726853372.87560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853372.87584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853372.87658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853373.11454: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6941 
0 --:--:-- --:--:-- --:--:-- 7093\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 16218 0 --:--:-- --:--:-- --:--:-- 17117", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:29:33.027440", "end": "2024-09-20 13:29:33.113282", "delta": "0:00:00.085842", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15494 1726853373.13123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 
<<< 15494 1726853373.13129: stdout chunk (state=3): >>><<< 15494 1726853373.13132: stderr chunk (state=3): >>><<< 15494 1726853373.13163: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6941 0 --:--:-- --:--:-- --:--:-- 7093\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 16218 0 --:--:-- --:--:-- --:--:-- 17117", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:29:33.027440", "end": "2024-09-20 13:29:33.113282", "delta": "0:00:00.085842", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.153 closed. 15494 1726853373.13339: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15494 1726853373.13345: _low_level_execute_command(): starting 15494 1726853373.13353: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853372.750782-17310-167380348251092/ > /dev/null 2>&1 && sleep 0' 15494 1726853373.14121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15494 1726853373.14135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15494 1726853373.14202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853373.14217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15494 1726853373.14292: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.45.153 is address <<< 15494 1726853373.14304: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15494 1726853373.14380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found <<< 15494 1726853373.14433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15494 1726853373.14599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK <<< 15494 1726853373.14672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15494 1726853373.14815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15494 1726853373.16704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15494 1726853373.16707: stdout chunk (state=3): >>><<< 15494 1726853373.16709: stderr chunk (state=3): >>><<< 15494 1726853373.16725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.153 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.153 originally 10.31.45.153 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a2da574bb2' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15494 1726853373.16735: handler run complete 15494 1726853373.16877: Evaluated conditional (False): False 15494 1726853373.16881: attempt loop complete, returning result 15494 1726853373.16883: _execute() done 15494 1726853373.16886: dumping result to json 15494 1726853373.16892: done dumping result, returning 15494 1726853373.16894: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [02083763-bbaf-0028-1a50-00000000050c] 15494 1726853373.16897: sending task result for task 02083763-bbaf-0028-1a50-00000000050c 15494 1726853373.16992: done sending task result for task 02083763-bbaf-0028-1a50-00000000050c ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.085842", "end": "2024-09-20 13:29:33.113282", "rc": 0, "start": "2024-09-20 13:29:33.027440" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6941 0 --:--:-- --:--:-- --:--:-- 7093 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 16218 0 --:--:-- --:--:-- --:--:-- 17117 15494 1726853373.17201: no more pending results, returning what we have 15494 1726853373.17205: results queue empty 15494 1726853373.17206: 
checking for any_errors_fatal 15494 1726853373.17214: done checking for any_errors_fatal 15494 1726853373.17214: checking for max_fail_percentage 15494 1726853373.17216: done checking for max_fail_percentage 15494 1726853373.17217: checking to see if all hosts have failed and the running result is not ok 15494 1726853373.17217: done checking to see if all hosts have failed 15494 1726853373.17218: getting the remaining hosts for this loop 15494 1726853373.17225: done getting the remaining hosts for this loop 15494 1726853373.17228: getting the next task for host managed_node1 15494 1726853373.17238: done getting next task for host managed_node1 15494 1726853373.17241: ^ task is: TASK: meta (flush_handlers) 15494 1726853373.17243: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853373.17247: getting variables 15494 1726853373.17248: in VariableManager get_vars() 15494 1726853373.17278: Calling all_inventory to load vars for managed_node1 15494 1726853373.17280: Calling groups_inventory to load vars for managed_node1 15494 1726853373.17284: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853373.17295: Calling all_plugins_play to load vars for managed_node1 15494 1726853373.17431: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853373.17437: WORKER PROCESS EXITING 15494 1726853373.17441: Calling groups_plugins_play to load vars for managed_node1 15494 1726853373.19075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853373.20750: done with get_vars() 15494 1726853373.20839: done getting variables 15494 1726853373.20990: in VariableManager get_vars() 15494 1726853373.21001: Calling all_inventory to load vars for managed_node1 15494 1726853373.21003: Calling groups_inventory to load vars for managed_node1 15494 1726853373.21005: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853373.21017: Calling all_plugins_play to load vars for managed_node1 15494 1726853373.21021: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853373.21030: Calling groups_plugins_play to load vars for managed_node1 15494 1726853373.22461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853373.24144: done with get_vars() 15494 1726853373.24176: done queuing things up, now waiting for results queue to drain 15494 1726853373.24178: results queue empty 15494 1726853373.24179: checking for any_errors_fatal 15494 1726853373.24183: done checking for any_errors_fatal 15494 1726853373.24184: checking for max_fail_percentage 15494 1726853373.24185: done checking for max_fail_percentage 15494 1726853373.24185: checking to see if all 
hosts have failed and the running result is not ok 15494 1726853373.24186: done checking to see if all hosts have failed 15494 1726853373.24187: getting the remaining hosts for this loop 15494 1726853373.24188: done getting the remaining hosts for this loop 15494 1726853373.24191: getting the next task for host managed_node1 15494 1726853373.24195: done getting next task for host managed_node1 15494 1726853373.24196: ^ task is: TASK: meta (flush_handlers) 15494 1726853373.24198: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15494 1726853373.24200: getting variables 15494 1726853373.24201: in VariableManager get_vars() 15494 1726853373.24209: Calling all_inventory to load vars for managed_node1 15494 1726853373.24211: Calling groups_inventory to load vars for managed_node1 15494 1726853373.24213: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853373.24219: Calling all_plugins_play to load vars for managed_node1 15494 1726853373.24221: Calling groups_plugins_inventory to load vars for managed_node1 15494 1726853373.24223: Calling groups_plugins_play to load vars for managed_node1 15494 1726853373.25601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853373.27433: done with get_vars() 15494 1726853373.27469: done getting variables 15494 1726853373.27559: in VariableManager get_vars() 15494 1726853373.27578: Calling all_inventory to load vars for managed_node1 15494 1726853373.27580: Calling groups_inventory to load vars for managed_node1 15494 1726853373.27583: Calling all_plugins_inventory to load vars for managed_node1 15494 1726853373.27588: Calling all_plugins_play to load vars for managed_node1 15494 1726853373.27590: Calling 
groups_plugins_inventory to load vars for managed_node1 15494 1726853373.27593: Calling groups_plugins_play to load vars for managed_node1 15494 1726853373.28935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15494 1726853373.30712: done with get_vars() 15494 1726853373.30755: done queuing things up, now waiting for results queue to drain 15494 1726853373.30773: results queue empty 15494 1726853373.30774: checking for any_errors_fatal 15494 1726853373.30775: done checking for any_errors_fatal 15494 1726853373.30776: checking for max_fail_percentage 15494 1726853373.30777: done checking for max_fail_percentage 15494 1726853373.30778: checking to see if all hosts have failed and the running result is not ok 15494 1726853373.30779: done checking to see if all hosts have failed 15494 1726853373.30779: getting the remaining hosts for this loop 15494 1726853373.30780: done getting the remaining hosts for this loop 15494 1726853373.30783: getting the next task for host managed_node1 15494 1726853373.30787: done getting next task for host managed_node1 15494 1726853373.30788: ^ task is: None 15494 1726853373.30789: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15494 1726853373.30791: done queuing things up, now waiting for results queue to drain 15494 1726853373.30791: results queue empty 15494 1726853373.30792: checking for any_errors_fatal 15494 1726853373.30793: done checking for any_errors_fatal 15494 1726853373.30794: checking for max_fail_percentage 15494 1726853373.30794: done checking for max_fail_percentage 15494 1726853373.30795: checking to see if all hosts have failed and the running result is not ok 15494 1726853373.30796: done checking to see if all hosts have failed 15494 1726853373.30797: getting the next task for host managed_node1 15494 1726853373.30799: done getting next task for host managed_node1 15494 1726853373.30800: ^ task is: None 15494 1726853373.30801: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=82   changed=3    unreachable=0    failed=0    skipped=71   rescued=0    ignored=2

Friday 20 September 2024  13:29:33 -0400 (0:00:00.668)       0:00:41.924 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.00s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 2.00s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.83s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.63s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
fedora.linux_system_roles.network : Check which packages are installed --- 1.06s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 0.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 0.92s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.90s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.88s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 0.87s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
fedora.linux_system_roles.network : Check which packages are installed --- 0.84s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.79s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.76s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
15494 1726853373.30915: RUNNING CLEANUP
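The "Verify DNS and network connectivity" task that dominates the log above runs a small shell script via `ansible.legacy.command`. A minimal standalone sketch of that script, reconstructed from the task's `_raw_params` in the log, is below; the function name `check_connectivity`, the `-sS` and `--max-time 30` curl flags, and the trailing `OK` line are additions for reuse outside the playbook and are not part of the original task.

```shell
#!/usr/bin/env bash
# Sketch of the playbook's DNS/connectivity check, reconstructed from the
# task's _raw_params in the log. Wrapping it in a function (an addition,
# not in the original) lets it be reused against any host list.
set -euo pipefail

check_connectivity() {
    # For each host: resolve it with getent, then fetch its HTTPS endpoint
    # with curl, failing fast on either step -- mirroring the task's loop.
    local host
    for host in "$@"; do
        if ! getent hosts "$host" >/dev/null; then
            echo "FAILED to lookup host $host"
            return 1
        fi
        # -sS/--max-time are assumptions for quieter, bounded output; the
        # original task called plain `curl -o /dev/null`.
        if ! curl -sS -o /dev/null --max-time 30 "https://$host"; then
            echo "FAILED to contact host $host"
            return 1
        fi
    done
    echo "CHECK DNS AND CONNECTIVITY OK"
}

# Usage (needs outbound network access, as in the test run above):
#   check_connectivity mirrors.fedoraproject.org mirrors.centos.org
```

Because the script runs under `set -euo pipefail`, Ansible reports the task as failed the moment either the lookup or the HTTPS probe fails for any host, which is exactly the rc=0/rc!=0 contract the log's task result reflects.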