[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
15247 1726867229.49336: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
15247 1726867229.49806: Added group all to inventory
15247 1726867229.49808: Added group ungrouped to inventory
15247 1726867229.49812: Group all now contains ungrouped
15247 1726867229.49815: Examining possible inventory source: /tmp/network-5rw/inventory.yml
15247 1726867229.78498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
15247 1726867229.78678: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
15247 1726867229.78701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
15247 1726867229.78880: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
15247 1726867229.79071: Loaded config def from plugin (inventory/script)
15247 1726867229.79073: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
15247 1726867229.79119: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
15247 1726867229.79328: Loaded config def from plugin
(inventory/yaml) 15247 1726867229.79330: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 15247 1726867229.79516: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 15247 1726867229.80524: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 15247 1726867229.80527: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 15247 1726867229.80530: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 15247 1726867229.80536: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 15247 1726867229.80541: Loading data from /tmp/network-5rw/inventory.yml 15247 1726867229.80729: /tmp/network-5rw/inventory.yml was not parsable by auto 15247 1726867229.80893: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 15247 1726867229.81017: Loading data from /tmp/network-5rw/inventory.yml 15247 1726867229.81106: group all already in inventory 15247 1726867229.81230: set inventory_file for managed_node1 15247 1726867229.81235: set inventory_dir for managed_node1 15247 1726867229.81236: Added host managed_node1 to inventory 15247 1726867229.81238: Added host managed_node1 to group all 15247 1726867229.81240: set ansible_host for managed_node1 15247 1726867229.81240: set ansible_ssh_extra_args for managed_node1 15247 1726867229.81244: set inventory_file for managed_node2 15247 1726867229.81247: set inventory_dir for managed_node2 15247 1726867229.81249: Added host managed_node2 to inventory 15247 1726867229.81251: Added host managed_node2 to group all 15247 1726867229.81252: set ansible_host for managed_node2 15247 1726867229.81252: set ansible_ssh_extra_args for managed_node2 15247 
1726867229.81255: set inventory_file for managed_node3 15247 1726867229.81257: set inventory_dir for managed_node3 15247 1726867229.81258: Added host managed_node3 to inventory 15247 1726867229.81259: Added host managed_node3 to group all 15247 1726867229.81260: set ansible_host for managed_node3 15247 1726867229.81261: set ansible_ssh_extra_args for managed_node3 15247 1726867229.81268: Reconcile groups and hosts in inventory. 15247 1726867229.81273: Group ungrouped now contains managed_node1 15247 1726867229.81275: Group ungrouped now contains managed_node2 15247 1726867229.81276: Group ungrouped now contains managed_node3 15247 1726867229.81472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 15247 1726867229.81723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 15247 1726867229.81889: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 15247 1726867229.81922: Loaded config def from plugin (vars/host_group_vars) 15247 1726867229.81925: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 15247 1726867229.81932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 15247 1726867229.81940: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 15247 1726867229.82108: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 15247 1726867229.82800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867229.83015: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 15247 1726867229.83054: Loaded config def from plugin (connection/local) 15247 1726867229.83057: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 15247 1726867229.84617: Loaded config def from plugin (connection/paramiko_ssh) 15247 1726867229.84620: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 15247 1726867229.86700: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15247 1726867229.86818: Loaded config def from plugin (connection/psrp) 15247 1726867229.86821: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 15247 1726867229.88397: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15247 1726867229.88434: Loaded config def from plugin (connection/ssh) 15247 1726867229.88437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 15247 1726867229.92960: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 15247 1726867229.93007: Loaded config def from plugin (connection/winrm) 15247 1726867229.93017: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 15247 1726867229.93049: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 15247 1726867229.93118: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 15247 1726867229.93199: Loaded config def from plugin (shell/cmd) 15247 1726867229.93201: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 15247 1726867229.93234: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 15247 1726867229.93301: Loaded config def from plugin (shell/powershell) 15247 1726867229.93303: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 15247 1726867229.93366: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 15247 1726867229.93566: Loaded config def from plugin (shell/sh) 15247 1726867229.93568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 15247 1726867229.93602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 15247 1726867229.93730: Loaded config def from plugin (become/runas) 15247 1726867229.93733: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 15247 1726867229.93939: Loaded config def from plugin (become/su) 15247 1726867229.93942: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 15247 1726867229.94123: Loaded config def from plugin (become/sudo) 15247 
1726867229.94126: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 15247 1726867229.94158: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml 15247 1726867229.94591: in VariableManager get_vars() 15247 1726867229.94618: done with get_vars() 15247 1726867229.94762: trying /usr/local/lib/python3.12/site-packages/ansible/modules 15247 1726867229.99626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 15247 1726867230.00026: in VariableManager get_vars() 15247 1726867230.00032: done with get_vars() 15247 1726867230.00034: variable 'playbook_dir' from source: magic vars 15247 1726867230.00035: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.00036: variable 'ansible_config_file' from source: magic vars 15247 1726867230.00036: variable 'groups' from source: magic vars 15247 1726867230.00037: variable 'omit' from source: magic vars 15247 1726867230.00044: variable 'ansible_version' from source: magic vars 15247 1726867230.00045: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.00046: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.00047: variable 'ansible_forks' from source: magic vars 15247 1726867230.00047: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.00049: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.00049: variable 'ansible_limit' from source: magic vars 15247 1726867230.00050: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.00051: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.00086: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml 15247 1726867230.00611: in VariableManager 
get_vars() 15247 1726867230.00628: done with get_vars() 15247 1726867230.00668: in VariableManager get_vars() 15247 1726867230.00681: done with get_vars() 15247 1726867230.00716: in VariableManager get_vars() 15247 1726867230.00727: done with get_vars() 15247 1726867230.00801: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15247 1726867230.01040: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15247 1726867230.01182: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15247 1726867230.01897: in VariableManager get_vars() 15247 1726867230.01921: done with get_vars() 15247 1726867230.03003: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 15247 1726867230.03339: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15247 1726867230.05625: in VariableManager get_vars() 15247 1726867230.05629: done with get_vars() 15247 1726867230.05632: variable 'playbook_dir' from source: magic vars 15247 1726867230.05633: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.05633: variable 'ansible_config_file' from source: magic vars 15247 1726867230.05634: variable 'groups' from source: magic vars 15247 1726867230.05635: variable 'omit' from source: magic vars 15247 1726867230.05635: variable 'ansible_version' from source: magic vars 15247 1726867230.05636: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.05637: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.05638: variable 'ansible_forks' from source: magic vars 15247 1726867230.05638: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.05639: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.05640: 
variable 'ansible_limit' from source: magic vars 15247 1726867230.05640: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.05641: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.05681: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15247 1726867230.05799: in VariableManager get_vars() 15247 1726867230.05813: done with get_vars() 15247 1726867230.05861: in VariableManager get_vars() 15247 1726867230.05864: done with get_vars() 15247 1726867230.05867: variable 'playbook_dir' from source: magic vars 15247 1726867230.05868: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.05869: variable 'ansible_config_file' from source: magic vars 15247 1726867230.05870: variable 'groups' from source: magic vars 15247 1726867230.05870: variable 'omit' from source: magic vars 15247 1726867230.05873: variable 'ansible_version' from source: magic vars 15247 1726867230.05874: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.05875: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.05909: variable 'ansible_forks' from source: magic vars 15247 1726867230.05974: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.05976: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.05978: variable 'ansible_limit' from source: magic vars 15247 1726867230.05979: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.05980: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.06047: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15247 1726867230.06474: in VariableManager get_vars() 15247 1726867230.06489: done with get_vars() 15247 1726867230.06717: in VariableManager get_vars() 15247 1726867230.06788: done with get_vars() 15247 1726867230.06816: variable 'playbook_dir' from 
source: magic vars 15247 1726867230.06817: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.06818: variable 'ansible_config_file' from source: magic vars 15247 1726867230.06819: variable 'groups' from source: magic vars 15247 1726867230.06820: variable 'omit' from source: magic vars 15247 1726867230.06825: variable 'ansible_version' from source: magic vars 15247 1726867230.06850: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.06851: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.06852: variable 'ansible_forks' from source: magic vars 15247 1726867230.06857: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.06857: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.06858: variable 'ansible_limit' from source: magic vars 15247 1726867230.06859: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.06860: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.06891: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 15247 1726867230.06980: in VariableManager get_vars() 15247 1726867230.06983: done with get_vars() 15247 1726867230.06985: variable 'playbook_dir' from source: magic vars 15247 1726867230.06986: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.06987: variable 'ansible_config_file' from source: magic vars 15247 1726867230.06987: variable 'groups' from source: magic vars 15247 1726867230.06988: variable 'omit' from source: magic vars 15247 1726867230.06989: variable 'ansible_version' from source: magic vars 15247 1726867230.06990: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.06990: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.06991: variable 'ansible_forks' from source: magic vars 15247 1726867230.06992: variable 'ansible_inventory_sources' 
from source: magic vars 15247 1726867230.06993: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.06993: variable 'ansible_limit' from source: magic vars 15247 1726867230.06994: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.06995: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.07022: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 15247 1726867230.07140: in VariableManager get_vars() 15247 1726867230.07156: done with get_vars() 15247 1726867230.07375: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15247 1726867230.07520: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15247 1726867230.08223: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15247 1726867230.09333: in VariableManager get_vars() 15247 1726867230.09355: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15247 1726867230.11981: in VariableManager get_vars() 15247 1726867230.11993: done with get_vars() 15247 1726867230.12030: in VariableManager get_vars() 15247 1726867230.12032: done with get_vars() 15247 1726867230.12034: variable 'playbook_dir' from source: magic vars 15247 1726867230.12035: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.12036: variable 'ansible_config_file' from source: magic vars 15247 1726867230.12037: variable 'groups' from source: magic vars 15247 1726867230.12037: variable 'omit' from source: magic vars 15247 1726867230.12038: variable 'ansible_version' from source: magic vars 15247 1726867230.12039: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.12039: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.12040: variable 
'ansible_forks' from source: magic vars 15247 1726867230.12041: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.12042: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.12042: variable 'ansible_limit' from source: magic vars 15247 1726867230.12043: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.12044: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.12084: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 15247 1726867230.12152: in VariableManager get_vars() 15247 1726867230.12172: done with get_vars() 15247 1726867230.12214: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 15247 1726867230.12439: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 15247 1726867230.12552: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 15247 1726867230.17381: in VariableManager get_vars() 15247 1726867230.17418: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15247 1726867230.21155: in VariableManager get_vars() 15247 1726867230.21158: done with get_vars() 15247 1726867230.21160: variable 'playbook_dir' from source: magic vars 15247 1726867230.21161: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.21162: variable 'ansible_config_file' from source: magic vars 15247 1726867230.21163: variable 'groups' from source: magic vars 15247 1726867230.21163: variable 'omit' from source: magic vars 15247 1726867230.21164: variable 'ansible_version' from source: magic vars 15247 1726867230.21165: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.21165: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.21166: variable 
'ansible_forks' from source: magic vars 15247 1726867230.21167: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.21168: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.21168: variable 'ansible_limit' from source: magic vars 15247 1726867230.21169: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.21170: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.21396: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15247 1726867230.21464: in VariableManager get_vars() 15247 1726867230.21491: done with get_vars() 15247 1726867230.21898: in VariableManager get_vars() 15247 1726867230.21901: done with get_vars() 15247 1726867230.21905: variable 'playbook_dir' from source: magic vars 15247 1726867230.21906: variable 'ansible_playbook_python' from source: magic vars 15247 1726867230.21907: variable 'ansible_config_file' from source: magic vars 15247 1726867230.21908: variable 'groups' from source: magic vars 15247 1726867230.21908: variable 'omit' from source: magic vars 15247 1726867230.21909: variable 'ansible_version' from source: magic vars 15247 1726867230.21910: variable 'ansible_check_mode' from source: magic vars 15247 1726867230.21911: variable 'ansible_diff_mode' from source: magic vars 15247 1726867230.21911: variable 'ansible_forks' from source: magic vars 15247 1726867230.21912: variable 'ansible_inventory_sources' from source: magic vars 15247 1726867230.21913: variable 'ansible_skip_tags' from source: magic vars 15247 1726867230.21914: variable 'ansible_limit' from source: magic vars 15247 1726867230.21914: variable 'ansible_run_tags' from source: magic vars 15247 1726867230.21915: variable 'ansible_verbosity' from source: magic vars 15247 1726867230.21953: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 15247 1726867230.22069: 
in VariableManager get_vars() 15247 1726867230.22127: done with get_vars() 15247 1726867230.22355: in VariableManager get_vars() 15247 1726867230.22367: done with get_vars() 15247 1726867230.22566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 15247 1726867230.22584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 15247 1726867230.23444: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 15247 1726867230.24045: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 15247 1726867230.24059: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 15247 1726867230.24121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 15247 1726867230.24150: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 15247 1726867230.24767: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 15247 1726867230.25159: Loaded config def from plugin (callback/default) 15247 1726867230.25162: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15247 1726867230.28623: Loaded config def from plugin (callback/junit) 15247 1726867230.28626: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15247 1726867230.28672: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 15247 1726867230.28857: Loaded config def from plugin (callback/minimal) 15247 1726867230.28860: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15247 1726867230.28930: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 15247 1726867230.28993: Loaded config def from plugin (callback/tree) 15247 1726867230.28995: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) 
ansible.builtin.profile_tasks to ansible.posix.profile_tasks
15247 1726867230.29328: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
15247 1726867230.29331: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bridge_nm.yml **************************************************
11 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml
15247 1726867230.29360: in VariableManager get_vars()
15247 1726867230.29376: done with get_vars()
15247 1726867230.29385: in VariableManager get_vars()
15247 1726867230.29489: done with get_vars()
15247 1726867230.29496: variable 'omit' from source: magic vars
15247 1726867230.29640: in VariableManager get_vars()
15247 1726867230.29655: done with get_vars()
15247 1726867230.29676: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] ***********
15247 1726867230.30990: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
15247 1726867230.31085: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
15247 1726867230.31243: getting the remaining hosts for this loop
15247 1726867230.31245: done getting the remaining hosts for this loop
15247 1726867230.31249: getting the next task for host managed_node2
15247 1726867230.31252: done getting next task for host managed_node2
15247 1726867230.31254: ^ task is: TASK: Gathering Facts
15247 1726867230.31256: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867230.31258: getting variables
15247 1726867230.31259: in VariableManager get_vars()
15247 1726867230.31271: Calling all_inventory to load vars for managed_node2
15247 1726867230.31274: Calling groups_inventory to load vars for managed_node2
15247 1726867230.31276: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867230.31288: Calling all_plugins_play to load vars for managed_node2
15247 1726867230.31298: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867230.31301: Calling groups_plugins_play to load vars for managed_node2
15247 1726867230.31333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867230.31414: done with get_vars()
15247 1726867230.31421: done getting variables
15247 1726867230.31513: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Friday 20 September 2024 17:20:30 -0400 (0:00:00.024) 0:00:00.024 ******
15247 1726867230.31533: entering _queue_task() for managed_node2/gather_facts
15247 1726867230.31535: Creating lock for gather_facts
15247 1726867230.31873: worker is 1 (out of 1
available) 15247 1726867230.32030: exiting _queue_task() for managed_node2/gather_facts 15247 1726867230.32041: done queuing things up, now waiting for results queue to drain 15247 1726867230.32042: waiting for pending results... 15247 1726867230.32152: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867230.32263: in run() - task 0affcac9-a3a5-8ce3-1923-00000000007e 15247 1726867230.32289: variable 'ansible_search_path' from source: unknown 15247 1726867230.32331: calling self._execute() 15247 1726867230.32398: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867230.32412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867230.32461: variable 'omit' from source: magic vars 15247 1726867230.32581: variable 'omit' from source: magic vars 15247 1726867230.32683: variable 'omit' from source: magic vars 15247 1726867230.32686: variable 'omit' from source: magic vars 15247 1726867230.32761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867230.33086: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867230.33090: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867230.33092: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867230.33094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867230.33153: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867230.33210: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867230.33519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867230.33783: Set connection var 
ansible_shell_executable to /bin/sh 15247 1726867230.33786: Set connection var ansible_connection to ssh 15247 1726867230.33788: Set connection var ansible_shell_type to sh 15247 1726867230.33790: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867230.33792: Set connection var ansible_timeout to 10 15247 1726867230.33795: Set connection var ansible_pipelining to False 15247 1726867230.34173: variable 'ansible_shell_executable' from source: unknown 15247 1726867230.34176: variable 'ansible_connection' from source: unknown 15247 1726867230.34181: variable 'ansible_module_compression' from source: unknown 15247 1726867230.34183: variable 'ansible_shell_type' from source: unknown 15247 1726867230.34185: variable 'ansible_shell_executable' from source: unknown 15247 1726867230.34187: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867230.34189: variable 'ansible_pipelining' from source: unknown 15247 1726867230.34191: variable 'ansible_timeout' from source: unknown 15247 1726867230.34194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867230.34529: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867230.34620: variable 'omit' from source: magic vars 15247 1726867230.34660: starting attempt loop 15247 1726867230.34683: running the handler 15247 1726867230.34779: variable 'ansible_facts' from source: unknown 15247 1726867230.34846: _low_level_execute_command(): starting 15247 1726867230.34859: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867230.35976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867230.36017: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867230.36137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867230.36140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867230.36143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867230.36145: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867230.36245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867230.38082: stdout chunk (state=3): >>>/root <<< 15247 1726867230.38584: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867230.38588: stdout chunk (state=3): >>><<< 15247 1726867230.38590: stderr chunk (state=3): >>><<< 15247 1726867230.38593: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867230.38596: _low_level_execute_command(): starting 15247 1726867230.38599: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148 `" && echo ansible-tmp-1726867230.3836334-15284-235496990554148="` echo /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148 `" ) && sleep 0' 15247 1726867230.39573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867230.39590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867230.39606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867230.39626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867230.39659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867230.39692: stderr chunk (state=3): >>>debug2: match not found 
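Editor's note on the mkdir one-liner above: its structure can be reconstructed as below. This is an illustrative sketch only — the function name and arguments are ours, not ansible's (the real string is assembled by ansible's shell plugin) — showing how the timestamped, randomized remote tmpdir name is built, with `umask 77` keeping the directory and the module payload readable by the connecting user only:

```python
def make_remote_tmp_command(basedir, timestamp, pid, rnd):
    # Rebuild the tmpdir-creation one-liner seen in this log (hypothetical
    # helper; ansible constructs this inside its shell plugin).
    name = f"ansible-tmp-{timestamp}-{pid}-{rnd}"
    tmpdir = f"{basedir}/{name}"
    # umask 77 -> directory is rwx for owner only, so the transferred
    # AnsiballZ payload cannot be read by other users on the target.
    return (
        f'( umask 77 && mkdir -p "` echo {basedir} `" '
        f'&& mkdir "` echo {tmpdir} `" '
        f'&& echo {name}="` echo {tmpdir} `" ) && sleep 0'
    )

# Reproduces the command traced above (before the /bin/sh -c '...' wrapper):
cmd = make_remote_tmp_command(
    "/root/.ansible/tmp", "1726867230.3836334", 15284, 235496990554148
)
```

The trailing `echo name="dir"` is what lets the controller read the created path back out of stdout, as seen in the `_low_level_execute_command() done:` record that follows.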
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867230.39840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867230.39964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867230.40116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867230.42105: stdout chunk (state=3): >>>ansible-tmp-1726867230.3836334-15284-235496990554148=/root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148 <<< 15247 1726867230.42269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867230.42273: stdout chunk (state=3): >>><<< 15247 1726867230.42276: stderr chunk (state=3): >>><<< 15247 1726867230.42485: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867230.3836334-15284-235496990554148=/root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867230.42488: variable 'ansible_module_compression' from source: unknown 15247 1726867230.42599: ANSIBALLZ: Using generic lock for ansible.legacy.setup 15247 1726867230.42610: ANSIBALLZ: Acquiring lock 15247 1726867230.42618: ANSIBALLZ: Lock acquired: 140393880930304 15247 1726867230.42625: ANSIBALLZ: Creating module 15247 1726867231.06624: ANSIBALLZ: Writing module into payload 15247 1726867231.06872: ANSIBALLZ: Writing module 15247 1726867231.07013: ANSIBALLZ: Renaming module 15247 1726867231.07026: ANSIBALLZ: Done creating module 15247 1726867231.07069: variable 'ansible_facts' from source: unknown 15247 1726867231.07128: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867231.07170: _low_level_execute_command(): starting 15247 1726867231.07184: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 15247 1726867231.07861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 15247 1726867231.07969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867231.07990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867231.08195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867231.08353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867231.10211: stdout chunk (state=3): >>>PLATFORM Linux <<< 15247 1726867231.10214: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 15247 1726867231.10217: stdout chunk (state=3): >>>/usr/bin/python3 <<< 15247 1726867231.10219: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 15247 1726867231.10642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867231.10646: stdout chunk (state=3): >>><<< 15247 1726867231.10648: stderr chunk (state=3): >>><<< 15247 1726867231.10651: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867231.10655 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 15247 1726867231.10658: _low_level_execute_command(): starting 15247 1726867231.10660: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 15247 1726867231.11081: Sending initial data 15247 1726867231.11085: Sent initial data (1181 bytes) 15247 1726867231.12017: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867231.12030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not 
found <<< 15247 1726867231.12048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867231.12168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867231.12278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867231.12498: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867231.15832: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 15247 1726867231.16305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867231.16316: stdout chunk (state=3): >>><<< 15247 1726867231.16327: stderr chunk (state=3): >>><<< 15247 1726867231.16411: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": 
"NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867231.17069: variable 'ansible_facts' from source: unknown 15247 1726867231.17072: variable 'ansible_facts' from source: unknown 15247 1726867231.17075: variable 'ansible_module_compression' from source: unknown 15247 1726867231.17079: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867231.17082: variable 'ansible_facts' from source: unknown 15247 1726867231.17809: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py 15247 1726867231.18512: Sending initial data 15247 1726867231.18516: Sent initial data (154 bytes) 15247 1726867231.19495: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867231.21125: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 15247 1726867231.21150: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867231.21166: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867231.21654: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpncz0t3az /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py <<< 15247 1726867231.21665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py" <<< 15247 1726867231.21899: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpncz0t3az" to remote "/root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py" <<< 15247 1726867231.25137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867231.25417: stderr chunk (state=3): >>><<< 15247 1726867231.25427: stdout chunk (state=3): >>><<< 15247 1726867231.25456: done transferring module to remote 15247 1726867231.25672: _low_level_execute_command(): starting 15247 1726867231.25676: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/ /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py && sleep 0' 15247 
1726867231.26931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867231.26946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867231.27175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867231.27226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867231.27245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867231.27260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867231.27508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867231.30044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867231.30048: stdout chunk (state=3): >>><<< 15247 1726867231.30051: stderr chunk (state=3): >>><<< 15247 1726867231.30067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867231.30076: _low_level_execute_command(): starting 15247 1726867231.30088: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/AnsiballZ_setup.py && sleep 0' 15247 1726867231.31995: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867231.32067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867231.32143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867231.34314: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15247 1726867231.34347: stdout chunk (state=3): >>>import _imp # builtin <<< 15247 1726867231.34376: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15247 1726867231.34472: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15247 1726867231.34489: stdout chunk (state=3): >>>import 'posix' # <<< 15247 1726867231.34519: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 15247 1726867231.34582: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 15247 1726867231.34612: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867231.34782: stdout chunk (state=3): >>>import '_codecs' # <<< 15247 1726867231.34786: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15247 1726867231.34791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15247 1726867231.34796: stdout chunk (state=3): >>>import 
'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b7604d0> <<< 15247 1726867231.34900: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b72fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b762a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15247 1726867231.34955: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15247 1726867231.34989: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 15247 1726867231.35097: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 15247 1726867231.35133: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15247 1726867231.35145: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b771130> <<< 15247 1726867231.35230: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15247 1726867231.35234: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b771fa0> <<< 15247 1726867231.35302: stdout chunk (state=3): >>>import 'site' # <<< 15247 1726867231.35309: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15247 1726867231.35673: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15247 1726867231.35768: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 15247 1726867231.35789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15247 1726867231.35804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 15247 1726867231.35839: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54fdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15247 1726867231.35931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54ffe0> <<< 15247 1726867231.35939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from 
'/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 15247 1726867231.35955: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15247 1726867231.36019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867231.36053: stdout chunk (state=3): >>>import 'itertools' # <<< 15247 1726867231.36092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b587800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15247 1726867231.36197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b587e90> import '_collections' # <<< 15247 1726867231.36218: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b567aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5651c0> <<< 15247 1726867231.36281: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54cf80> <<< 15247 1726867231.36315: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 15247 1726867231.36339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 15247 1726867231.36367: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 15247 1726867231.36392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15247 1726867231.36426: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 15247 1726867231.36574: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5a76e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5a6300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b566060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54ee70> <<< 15247 1726867231.36579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 15247 1726867231.36582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dc7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54c200> <<< 15247 1726867231.36608: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 15247 1726867231.36689: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 15247 1726867231.36702: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module 
'_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5dcc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dcb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5dcef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54ad20> <<< 15247 1726867231.36914: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dd5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dd280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5de4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 15247 1726867231.36948: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 15247 1726867231.36974: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f4680> <<< 15247 1726867231.37040: stdout chunk (state=3): >>>import 'errno' # <<< 15247 1726867231.37043: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5f5d30> <<< 15247 1726867231.37065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15247 1726867231.37127: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 15247 1726867231.37232: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f6bd0> <<< 15247 1726867231.37282: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5f7230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f6120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5f7cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f73e0> <<< 15247 1726867231.37311: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5de450> <<< 15247 1726867231.37325: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15247 1726867231.37463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 15247 1726867231.37470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15247 1726867231.37512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b2ebbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b314710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b314470> 
# extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b3146b0> <<< 15247 1726867231.37560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 15247 1726867231.37563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15247 1726867231.37631: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867231.37785: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b314fe0> <<< 15247 1726867231.37892: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b315910> <<< 15247 1726867231.38040: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b314890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b2e9d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from 
'/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b316cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b315790> <<< 15247 1726867231.38043: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5deba0> <<< 15247 1726867231.38070: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15247 1726867231.38144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867231.38301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15247 1726867231.38685: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b343020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3633e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from 
'/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3c4200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15247 1726867231.38707: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3c6960> <<< 15247 1726867231.38789: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3c4320> <<< 15247 1726867231.38824: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3911f0> <<< 15247 1726867231.38865: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad0d2e0> <<< 15247 1726867231.38967: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3621e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b317bf0> <<< 15247 1726867231.39073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15247 1726867231.39186: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe12b362300> <<< 15247 1726867231.39368: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_4p64son8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 15247 1726867231.39506: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.39538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 15247 1726867231.39631: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15247 1726867231.39660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15247 1726867231.39948: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad72f90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad51e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad51040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 15247 1726867231.39972: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.40038: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 15247 1726867231.40041: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.41524: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.42570: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad70e60> <<< 15247 1726867231.42683: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867231.42691: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 15247 1726867231.42735: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ada6930> <<< 15247 1726867231.42808: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada6720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada6030> <<< 15247 1726867231.42812: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 15247 1726867231.42819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15247 1726867231.42889: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada6a50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad73c20> import 'atexit' # <<< 15247 1726867231.43110: stdout chunk (state=3): 
>>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ada7680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ada78c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada7e00><<< 15247 1726867231.43114: stdout chunk (state=3): >>> import 'pwd' # <<< 15247 1726867231.43116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15247 1726867231.43266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac11c40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac13380> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15247 1726867231.43299: stdout chunk (state=3): >>>import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac141a0> <<< 15247 1726867231.43302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15247 1726867231.43336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 15247 1726867231.43405: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac15340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15247 1726867231.43414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 15247 1726867231.43566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15247 1726867231.43615: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac17e00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ad53080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac160c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 15247 1726867231.43621: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15247 1726867231.43655: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15247 1726867231.43947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1fcb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1e780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1e510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15247 1726867231.44611: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1ea50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac165d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac63e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac63f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac65af0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac658b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac67fe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac661b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867231.44615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac6b860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac68230> <<< 15247 
1726867231.44805: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6c620> <<< 15247 1726867231.44809: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6c8c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6ca40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac64230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867231.44838: stdout chunk (state=3): >>># extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12aafc230> <<< 15247 1726867231.44998: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867231.45002: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12aafd2e0> <<< 15247 1726867231.45029: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac6e9c0> <<< 15247 1726867231.45055: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6fd70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac6e5d0> <<< 15247 1726867231.45084: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.45087: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 15247 1726867231.45291: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.45504: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.45619: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.46163: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.46800: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ab015b0> <<< 15247 1726867231.46887: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 15247 1726867231.46917: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab02450> <<< 15247 1726867231.46920: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aafd5b0> <<< 15247 1726867231.46991: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 15247 1726867231.47037: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15247 1726867231.47174: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.47542: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab02b40> # zipimport: zlib available <<< 15247 1726867231.47816: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.48264: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.48333: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.48499: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 15247 1726867231.48700: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15247 1726867231.48744: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.48782: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 15247 1726867231.48793: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.49069: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.49250: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15247 1726867231.49313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15247 1726867231.49325: stdout chunk (state=3): >>>import '_ast' # <<< 15247 1726867231.49610: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab03620> # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.49613: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # 
zipimport: zlib available <<< 15247 1726867231.49646: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.49956: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867231.50055: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ab0e150> <<< 15247 1726867231.50178: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab098e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15247 1726867231.50206: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.50274: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.50305: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.50420: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 15247 1726867231.50423: stdout chunk 
(state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 15247 1726867231.50426: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 15247 1726867231.50596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15247 1726867231.50599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15247 1726867231.50601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12abf2ba0> <<< 15247 1726867231.50724: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ace6870> <<< 15247 1726867231.50738: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab0e390> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab06630> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15247 1726867231.50858: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15247 1726867231.50862: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15247 1726867231.50897: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15247 1726867231.50900: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.50960: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.51026: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.51074: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 15247 1726867231.51096: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.51115: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.51194: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.51310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15247 1726867231.51423: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.51645: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.51997: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 15247 1726867231.52215: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba23f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches 
/usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a720350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867231.52218: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a720680> <<< 15247 1726867231.52347: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab887a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba2f00> <<< 15247 1726867231.52350: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba0aa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba0680> <<< 15247 1726867231.52613: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a723590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a722e40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a723020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a722270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15247 1726867231.52720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a723710> <<< 15247 1726867231.52741: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 15247 1726867231.52774: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15247 1726867231.52997: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a7861e0> import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12a784200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba07a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.53484: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 15247 1726867231.53549: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.53613: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.53672: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.53939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15247 1726867231.54231: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.54769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.54832: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.54867: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 
'ansible.module_utils.facts.system.date_time' # <<< 15247 1726867231.54952: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 15247 1726867231.55008: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.55064: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15247 1726867231.55126: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.55164: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15247 1726867231.55189: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.55276: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 15247 1726867231.55336: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.55497: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 15247 1726867231.55500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a786420> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 15247 1726867231.55629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a787080> <<< 15247 1726867231.55632: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 15247 1726867231.55635: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.55755: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system.lsb' # <<< 15247 1726867231.55767: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.55863: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.55954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 15247 1726867231.55972: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.56074: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.56155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 15247 1726867231.56158: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.56260: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15247 1726867231.56332: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867231.56405: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a7be4b0> <<< 15247 1726867231.56596: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7ac8c0> import 'ansible.module_utils.facts.system.python' # <<< 15247 1726867231.56611: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.56667: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.56723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 15247 1726867231.56755: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.56854: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 
1726867231.56910: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.57194: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 15247 1726867231.57218: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.57260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 15247 1726867231.57292: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.57316: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.57366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15247 1726867231.57435: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867231.57463: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a7d2000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7d1f70> import 'ansible.module_utils.facts.system.user' # <<< 15247 1726867231.57516: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 15247 1726867231.57732: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.57746: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.57885: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' 
# <<< 15247 1726867231.57905: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.58199: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 15247 1726867231.58226: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.58247: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.58425: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.58537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 15247 1726867231.58552: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.58665: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.58839: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.58867: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.59526: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.60057: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15247 1726867231.60073: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.60223: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 15247 1726867231.60271: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.60379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 15247 1726867231.60706: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 15247 1726867231.60733: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 15247 1726867231.60809: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.60899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15247 1726867231.61141: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.61242: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61436: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 15247 1726867231.61468: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61490: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61530: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15247 1726867231.61548: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61698: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 15247 1726867231.61701: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61715: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15247 1726867231.61755: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61787: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61904: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 15247 1726867231.61910: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.61943: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 15247 1726867231.62006: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.62070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 15247 1726867231.62086: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.62332: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.62615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 15247 1726867231.62662: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.62724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 15247 1726867231.62765: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.62908: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # <<< 15247 1726867231.62914: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.62935: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.62970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 15247 1726867231.62974: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63050: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63128: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 15247 1726867231.63244: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 15247 1726867231.63311: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63340: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.virtual.base' # <<< 15247 1726867231.63343: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63369: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.63386: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63499: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.63597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 15247 1726867231.63619: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63794: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.63797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15247 1726867231.63899: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.64410: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867231.64475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15247 1726867231.64493: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.64628: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867231.64671: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15247 1726867231.64753: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15247 1726867231.65661: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a5cf6e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a5cc080> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a5cefc0> <<< 15247 1726867231.79227: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a615580> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a616300> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 
15247 1726867231.79255: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 15247 1726867231.79301: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7c4a10> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7c4500> <<< 15247 1726867231.79818: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 15247 1726867232.00127: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", 
"ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", 
"weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "31", "epoch": "1726867231", "epoch_int": "1726867231", "date": "2024-09-20", "time": "17:20:31", "iso8601_micro": "2024-09-20T21:20:31.661858Z", "iso8601": "2024-09-20T21:20:31Z", "iso8601_basic": "20240920T172031661858", "iso8601_basic_short": "20240920T172031", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.60205078125, "5m": 0.3740234375, "15m": 0.18310546875}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22<<< 15247 1726867232.00188: stdout chunk (state=3): >>>", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": 
{"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off 
[fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", 
"holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 469, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797208064, "block_size": 4096, "block_total": 65519099, "block_available": 63915334, "block_used": 1603765, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867232.00887: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15247 1726867232.00952: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 15247 1726867232.01038: stdout chunk (state=3): >>># clear sys.path # clear sys.argv<<< 15247 1726867232.01051: stdout chunk (state=3): >>> # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] 
removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix<<< 15247 1726867232.01074: stdout chunk (state=3): >>> # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc<<< 15247 1726867232.01102: stdout chunk (state=3): >>> # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack<<< 15247 1726867232.01219: stdout chunk (state=3): >>> # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 15247 1726867232.01231: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] 
removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2<<< 15247 1726867232.01267: stdout chunk (state=3): >>> # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath<<< 15247 1726867232.01374: stdout chunk (state=3): >>> # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile <<< 15247 1726867232.01379: stdout chunk (state=3): >>># cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing 
_locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string<<< 15247 1726867232.01406: stdout chunk (state=3): >>> # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common<<< 15247 1726867232.01429: stdout chunk (state=3): >>> # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] 
removing ansible.module_utils._text # destroy ansible.module_utils._text<<< 15247 1726867232.01558: stdout chunk (state=3): >>> # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 15247 1726867232.01583: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse<<< 15247 1726867232.01732: stdout chunk (state=3): >>> # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 15247 1726867232.02224: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 15247 1726867232.02251: stdout chunk (state=3): >>> <<< 15247 1726867232.02293: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 15247 1726867232.02357: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression <<< 15247 1726867232.02624: stdout chunk (state=3): >>># destroy _lzma # destroy _blake2 <<< 15247 1726867232.02627: stdout chunk (state=3): >>># destroy binascii<<< 15247 
1726867232.02630: stdout chunk (state=3): >>> # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json<<< 15247 1726867232.02661: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale <<< 15247 1726867232.02694: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal<<< 15247 1726867232.02721: stdout chunk (state=3): >>> # destroy _posixsubprocess # destroy syslog<<< 15247 1726867232.02781: stdout chunk (state=3): >>> # destroy uuid # destroy selinux <<< 15247 1726867232.02804: stdout chunk (state=3): >>># destroy shutil <<< 15247 1726867232.02837: stdout chunk (state=3): >>># destroy distro <<< 15247 1726867232.02958: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 15247 1726867232.03000: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle <<< 15247 1726867232.03025: stdout chunk (state=3): >>># destroy queue # destroy _heapq<<< 15247 1726867232.03049: stdout chunk (state=3): >>> # destroy _queue # destroy multiprocessing.reduction<<< 15247 1726867232.03106: stdout chunk (state=3): >>> # destroy selectors # destroy shlex # destroy fcntl # destroy datetime<<< 15247 1726867232.03132: stdout chunk (state=3): >>> # destroy subprocess # destroy base64<<< 15247 1726867232.03172: stdout chunk (state=3): >>> # destroy _ssl <<< 15247 
1726867232.03461: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 15247 1726867232.03498: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser<<< 15247 1726867232.03519: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 15247 1726867232.03555: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 15247 1726867232.03606: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 15247 1726867232.03693: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] 
wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct<<< 15247 1726867232.03697: stdout chunk (state=3): >>> # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 15247 1726867232.03710: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 15247 1726867232.03796: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat<<< 15247 1726867232.03823: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 15247 1726867232.03835: stdout chunk (state=3): >>> # destroy selinux._selinux # destroy systemd._daemon<<< 15247 1726867232.03994: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15247 1726867232.04110: stdout chunk (state=3): >>># destroy sys.monitoring <<< 15247 1726867232.04137: stdout chunk (state=3): >>># destroy _socket # destroy 
_collections <<< 15247 1726867232.04247: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 15247 1726867232.04276: stdout chunk (state=3): >>># destroy tokenize <<< 15247 1726867232.04338: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 15247 1726867232.04368: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib # destroy _typing<<< 15247 1726867232.04516: stdout chunk (state=3): >>> # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external <<< 15247 1726867232.04520: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules<<< 15247 1726867232.04523: stdout chunk (state=3): >>> # destroy _frozen_importlib<<< 15247 1726867232.04644: stdout chunk (state=3): >>> <<< 15247 1726867232.04701: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 15247 1726867232.04812: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437<<< 15247 1726867232.04844: stdout chunk (state=3): >>> # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 15247 1726867232.04883: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools<<< 15247 1726867232.04910: stdout chunk (state=3): >>> # destroy 
_abc # destroy posix # destroy _functools<<< 15247 1726867232.04953: stdout chunk (state=3): >>> # destroy builtins # destroy _thread # clear sys.audit hooks<<< 15247 1726867232.05202: stdout chunk (state=3): >>> <<< 15247 1726867232.05470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867232.05486: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 15247 1726867232.05544: stderr chunk (state=3): >>><<< 15247 1726867232.05559: stdout chunk (state=3): >>><<< 15247 1726867232.05869: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b7604d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b72fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b762a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 
'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b771130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b771fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54fdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54ffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b587800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12b587e90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b567aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5651c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54cf80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5a76e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5a6300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b566060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54ee70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dc7a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54c200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5dcc50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dcb00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5dcef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b54ad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dd5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5dd280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5de4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f4680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5f5d30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f6bd0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5f7230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f6120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b5f7cb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5f73e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5de450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b2ebbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b314710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b314470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b3146b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b314fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12b315910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b314890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b2e9d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b316cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b315790> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b5deba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12b343020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3633e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3c4200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3c6960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3c4320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3911f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad0d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b3621e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12b317bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe12b362300> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_4p64son8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad72f90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad51e80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad51040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad70e60> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ada6930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada6720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada6030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada6a50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ad73c20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ada7680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ada78c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ada7e00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac11c40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac13380> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac141a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac15340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac17e00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ad53080> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac160c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1fcb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1e780> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1e510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac1ea50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac165d0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac63e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac63f50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac65af0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac658b0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac67fe0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac661b0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac6b860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac68230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6c620> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6c8c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6ca40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac64230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12aafc230> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12aafd2e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac6e9c0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ac6fd70> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ac6e5d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ab015b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab02450> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aafd5b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab02b40> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab03620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12ab0e150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab098e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12abf2ba0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ace6870> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab0e390> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab06630> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba23f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a720350> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a720680> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12ab887a0> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba2f00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba0aa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba0680> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a723590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a722e40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a723020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a722270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a723710> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a7861e0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a784200> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12aba07a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a786420> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a787080> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a7be4b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7ac8c0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a7d2000> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7d1f70> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe12a5cf6e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a5cc080> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a5cefc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a615580> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a616300> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7c4a10> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe12a7c4500> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "31", "epoch": "1726867231", "epoch_int": "1726867231", "date": "2024-09-20", "time": "17:20:31", "iso8601_micro": "2024-09-20T21:20:31.661858Z", "iso8601": "2024-09-20T21:20:31Z", 
"iso8601_basic": "20240920T172031661858", "iso8601_basic_short": "20240920T172031", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.60205078125, "5m": 0.3740234375, "15m": 0.18310546875}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 
v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3297, "used": 234}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", 
"holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 469, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261797208064, "block_size": 4096, "block_total": 65519099, "block_available": 63915334, "block_used": 1603765, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing 
systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # 
cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing 
ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref 
# destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
[WARNING]: Module invocation had junk after the JSON data
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
15247 1726867232.09998: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867232.10028: _low_level_execute_command(): starting 15247 1726867232.10031: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867230.3836334-15284-235496990554148/ > /dev/null 2>&1 && sleep 0' 15247 1726867232.11651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867232.11655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867232.11717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867232.14671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867232.14674: stdout chunk (state=3): >>><<< 15247 1726867232.14680: stderr chunk (state=3): >>><<< 15247 1726867232.14682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867232.14685: handler run complete 15247 1726867232.15002: variable 'ansible_facts' from source: unknown 15247 1726867232.15093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 15247 1726867232.16143: variable 'ansible_facts' from source: unknown 15247 1726867232.16438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867232.16568: attempt loop complete, returning result 15247 1726867232.16573: _execute() done 15247 1726867232.16575: dumping result to json 15247 1726867232.16732: done dumping result, returning 15247 1726867232.16735: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-00000000007e] 15247 1726867232.16803: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000007e 15247 1726867232.17627: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000007e 15247 1726867232.17633: WORKER PROCESS EXITING ok: [managed_node2] 15247 1726867232.18045: no more pending results, returning what we have 15247 1726867232.18049: results queue empty 15247 1726867232.18050: checking for any_errors_fatal 15247 1726867232.18051: done checking for any_errors_fatal 15247 1726867232.18052: checking for max_fail_percentage 15247 1726867232.18053: done checking for max_fail_percentage 15247 1726867232.18054: checking to see if all hosts have failed and the running result is not ok 15247 1726867232.18055: done checking to see if all hosts have failed 15247 1726867232.18056: getting the remaining hosts for this loop 15247 1726867232.18057: done getting the remaining hosts for this loop 15247 1726867232.18061: getting the next task for host managed_node2 15247 1726867232.18066: done getting next task for host managed_node2 15247 1726867232.18068: ^ task is: TASK: meta (flush_handlers) 15247 1726867232.18070: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867232.18074: getting variables 15247 1726867232.18075: in VariableManager get_vars() 15247 1726867232.18097: Calling all_inventory to load vars for managed_node2 15247 1726867232.18100: Calling groups_inventory to load vars for managed_node2 15247 1726867232.18104: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867232.18113: Calling all_plugins_play to load vars for managed_node2 15247 1726867232.18115: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867232.18118: Calling groups_plugins_play to load vars for managed_node2 15247 1726867232.18820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867232.19208: done with get_vars() 15247 1726867232.19219: done getting variables 15247 1726867232.19637: in VariableManager get_vars() 15247 1726867232.19647: Calling all_inventory to load vars for managed_node2 15247 1726867232.19650: Calling groups_inventory to load vars for managed_node2 15247 1726867232.19653: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867232.19659: Calling all_plugins_play to load vars for managed_node2 15247 1726867232.19662: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867232.19664: Calling groups_plugins_play to load vars for managed_node2 15247 1726867232.20105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867232.20562: done with get_vars() 15247 1726867232.20690: done queuing things up, now waiting for results queue to drain 15247 1726867232.20692: results queue empty 15247 1726867232.20693: checking for any_errors_fatal 15247 1726867232.20696: done checking for any_errors_fatal 15247 1726867232.20696: checking for max_fail_percentage 15247 1726867232.20697: done checking for max_fail_percentage 15247 1726867232.20698: checking to see if all hosts have failed and the running result is not 
ok 15247 1726867232.20699: done checking to see if all hosts have failed 15247 1726867232.20699: getting the remaining hosts for this loop 15247 1726867232.20704: done getting the remaining hosts for this loop 15247 1726867232.20707: getting the next task for host managed_node2 15247 1726867232.20711: done getting next task for host managed_node2 15247 1726867232.20713: ^ task is: TASK: Include the task 'el_repo_setup.yml' 15247 1726867232.20715: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867232.20717: getting variables 15247 1726867232.20718: in VariableManager get_vars() 15247 1726867232.20724: Calling all_inventory to load vars for managed_node2 15247 1726867232.20726: Calling groups_inventory to load vars for managed_node2 15247 1726867232.20728: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867232.20732: Calling all_plugins_play to load vars for managed_node2 15247 1726867232.20734: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867232.20736: Calling groups_plugins_play to load vars for managed_node2 15247 1726867232.20967: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867232.21413: done with get_vars() 15247 1726867232.21421: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Friday 20 September 2024 17:20:32 -0400 (0:00:01.900) 0:00:01.925 ****** 15247 1726867232.21581: entering _queue_task() for managed_node2/include_tasks 15247 1726867232.21583: Creating lock for include_tasks 15247 1726867232.21935: worker is 1 
(out of 1 available) 15247 1726867232.21949: exiting _queue_task() for managed_node2/include_tasks 15247 1726867232.21961: done queuing things up, now waiting for results queue to drain 15247 1726867232.21962: waiting for pending results... 15247 1726867232.22218: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 15247 1726867232.22389: in run() - task 0affcac9-a3a5-8ce3-1923-000000000006 15247 1726867232.22392: variable 'ansible_search_path' from source: unknown 15247 1726867232.22395: calling self._execute() 15247 1726867232.22482: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867232.22513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867232.22546: variable 'omit' from source: magic vars 15247 1726867232.22753: _execute() done 15247 1726867232.22757: dumping result to json 15247 1726867232.22759: done dumping result, returning 15247 1726867232.22762: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-8ce3-1923-000000000006] 15247 1726867232.22764: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000006 15247 1726867232.22839: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000006 15247 1726867232.22843: WORKER PROCESS EXITING 15247 1726867232.22896: no more pending results, returning what we have 15247 1726867232.22901: in VariableManager get_vars() 15247 1726867232.22934: Calling all_inventory to load vars for managed_node2 15247 1726867232.22936: Calling groups_inventory to load vars for managed_node2 15247 1726867232.22940: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867232.22954: Calling all_plugins_play to load vars for managed_node2 15247 1726867232.22957: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867232.22992: Calling groups_plugins_play to load vars for managed_node2 15247 1726867232.23472: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867232.23648: done with get_vars() 15247 1726867232.23656: variable 'ansible_search_path' from source: unknown 15247 1726867232.23669: we have included files to process 15247 1726867232.23670: generating all_blocks data 15247 1726867232.23672: done generating all_blocks data 15247 1726867232.23673: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15247 1726867232.23674: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15247 1726867232.23678: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 15247 1726867232.24334: in VariableManager get_vars() 15247 1726867232.24348: done with get_vars() 15247 1726867232.24365: done processing included file 15247 1726867232.24367: iterating over new_blocks loaded from include file 15247 1726867232.24373: in VariableManager get_vars() 15247 1726867232.24384: done with get_vars() 15247 1726867232.24386: filtering new block on tags 15247 1726867232.24400: done filtering new block on tags 15247 1726867232.24403: in VariableManager get_vars() 15247 1726867232.24413: done with get_vars() 15247 1726867232.24415: filtering new block on tags 15247 1726867232.24430: done filtering new block on tags 15247 1726867232.24433: in VariableManager get_vars() 15247 1726867232.24443: done with get_vars() 15247 1726867232.24444: filtering new block on tags 15247 1726867232.24458: done filtering new block on tags 15247 1726867232.24460: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 15247 1726867232.24471: extending task lists for all hosts with included blocks 15247 
1726867232.24522: done extending task lists 15247 1726867232.24523: done processing included files 15247 1726867232.24524: results queue empty 15247 1726867232.24524: checking for any_errors_fatal 15247 1726867232.24525: done checking for any_errors_fatal 15247 1726867232.24526: checking for max_fail_percentage 15247 1726867232.24527: done checking for max_fail_percentage 15247 1726867232.24528: checking to see if all hosts have failed and the running result is not ok 15247 1726867232.24529: done checking to see if all hosts have failed 15247 1726867232.24529: getting the remaining hosts for this loop 15247 1726867232.24530: done getting the remaining hosts for this loop 15247 1726867232.24532: getting the next task for host managed_node2 15247 1726867232.24536: done getting next task for host managed_node2 15247 1726867232.24538: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 15247 1726867232.24540: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867232.24543: getting variables 15247 1726867232.24543: in VariableManager get_vars() 15247 1726867232.24551: Calling all_inventory to load vars for managed_node2 15247 1726867232.24553: Calling groups_inventory to load vars for managed_node2 15247 1726867232.24555: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867232.24559: Calling all_plugins_play to load vars for managed_node2 15247 1726867232.24561: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867232.24564: Calling groups_plugins_play to load vars for managed_node2 15247 1726867232.24728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867232.24925: done with get_vars() 15247 1726867232.24933: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:20:32 -0400 (0:00:00.034) 0:00:01.959 ****** 15247 1726867232.24993: entering _queue_task() for managed_node2/setup 15247 1726867232.25355: worker is 1 (out of 1 available) 15247 1726867232.25365: exiting _queue_task() for managed_node2/setup 15247 1726867232.25375: done queuing things up, now waiting for results queue to drain 15247 1726867232.25376: waiting for pending results... 
15247 1726867232.25602: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 15247 1726867232.25670: in run() - task 0affcac9-a3a5-8ce3-1923-00000000008f 15247 1726867232.25674: variable 'ansible_search_path' from source: unknown 15247 1726867232.25678: variable 'ansible_search_path' from source: unknown 15247 1726867232.25701: calling self._execute() 15247 1726867232.25776: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867232.25792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867232.25818: variable 'omit' from source: magic vars 15247 1726867232.26343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867232.28535: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867232.28609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867232.28643: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867232.28718: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867232.28726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867232.28805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867232.28850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867232.28934: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867232.28937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867232.28955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867232.29125: variable 'ansible_facts' from source: unknown 15247 1726867232.29205: variable 'network_test_required_facts' from source: task vars 15247 1726867232.29245: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 15247 1726867232.29266: variable 'omit' from source: magic vars 15247 1726867232.29307: variable 'omit' from source: magic vars 15247 1726867232.29367: variable 'omit' from source: magic vars 15247 1726867232.29374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867232.29407: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867232.29429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867232.29450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867232.29476: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867232.29508: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867232.29682: variable 'ansible_host' from source: host vars for 
'managed_node2' 15247 1726867232.29685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867232.29687: Set connection var ansible_shell_executable to /bin/sh 15247 1726867232.29689: Set connection var ansible_connection to ssh 15247 1726867232.29691: Set connection var ansible_shell_type to sh 15247 1726867232.29693: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867232.29695: Set connection var ansible_timeout to 10 15247 1726867232.29697: Set connection var ansible_pipelining to False 15247 1726867232.29699: variable 'ansible_shell_executable' from source: unknown 15247 1726867232.29701: variable 'ansible_connection' from source: unknown 15247 1726867232.29703: variable 'ansible_module_compression' from source: unknown 15247 1726867232.29705: variable 'ansible_shell_type' from source: unknown 15247 1726867232.29707: variable 'ansible_shell_executable' from source: unknown 15247 1726867232.29708: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867232.29710: variable 'ansible_pipelining' from source: unknown 15247 1726867232.29712: variable 'ansible_timeout' from source: unknown 15247 1726867232.29714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867232.29835: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867232.29848: variable 'omit' from source: magic vars 15247 1726867232.29856: starting attempt loop 15247 1726867232.29863: running the handler 15247 1726867232.29882: _low_level_execute_command(): starting 15247 1726867232.29893: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867232.30604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 
1726867232.30621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867232.30637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867232.30701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.30761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867232.30785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867232.30797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867232.31033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867232.32699: stdout chunk (state=3): >>>/root <<< 15247 1726867232.33385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867232.33388: stdout chunk (state=3): >>><<< 15247 1726867232.33391: stderr chunk (state=3): >>><<< 15247 1726867232.33394: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867232.33404: _low_level_execute_command(): starting 15247 1726867232.33406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251 `" && echo ansible-tmp-1726867232.3321695-15375-17512362496251="` echo /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251 `" ) && sleep 0' 15247 1726867232.34745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867232.34758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.34769: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.34981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867232.34994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867232.35272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867232.35575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867232.37421: stdout chunk (state=3): >>>ansible-tmp-1726867232.3321695-15375-17512362496251=/root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251 <<< 15247 1726867232.37523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867232.37575: stderr chunk (state=3): >>><<< 15247 1726867232.37581: stdout chunk (state=3): >>><<< 15247 1726867232.37597: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867232.3321695-15375-17512362496251=/root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867232.37839: variable 'ansible_module_compression' from source: unknown 15247 1726867232.37883: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867232.38115: variable 'ansible_facts' from source: unknown 15247 1726867232.38681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py 15247 1726867232.39297: Sending initial data 15247 1726867232.39300: Sent initial data (153 bytes) 15247 1726867232.41000: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867232.41032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.41264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867232.41267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867232.41816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867232.41820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867232.43466: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867232.43529: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867232.43580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpuhti7oem /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py <<< 15247 1726867232.43595: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py" <<< 15247 1726867232.43634: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpuhti7oem" to remote "/root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py" <<< 15247 1726867232.46879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867232.47188: stderr chunk (state=3): >>><<< 15247 1726867232.47191: stdout chunk (state=3): >>><<< 15247 1726867232.47193: done transferring module to remote 15247 1726867232.47196: _low_level_execute_command(): starting 15247 1726867232.47198: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/ /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py && sleep 0' 15247 1726867232.48326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867232.48396: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867232.48481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.48602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867232.48744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867232.48797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867232.51181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867232.51185: stdout chunk (state=3): >>><<< 15247 1726867232.51187: stderr chunk (state=3): >>><<< 15247 1726867232.51190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867232.51192: _low_level_execute_command(): starting 15247 1726867232.51194: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/AnsiballZ_setup.py && sleep 0' 15247 1726867232.52391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867232.52395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.52397: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867232.52399: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867232.52401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867232.52568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867232.52605: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 15247 1726867232.52617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867232.52728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867232.55096: stdout chunk (state=3): >>>import _frozen_importlib # frozen<<< 15247 1726867232.55210: stdout chunk (state=3): >>> import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 15247 1726867232.55289: stdout chunk (state=3): >>>import '_io' # <<< 15247 1726867232.55308: stdout chunk (state=3): >>>import 'marshal' # <<< 15247 1726867232.55496: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867232.55531: stdout chunk (state=3): >>>import '_codecs' # <<< 15247 1726867232.55662: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15247 1726867232.55666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15247 1726867232.55682: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b6184d0> <<< 15247 1726867232.55695: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b5e7b30> <<< 15247 1726867232.55854: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 15247 1726867232.55857: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b61aa50> import '_signal' # import '_abc' # <<< 15247 1726867232.55991: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 15247 1726867232.56045: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15247 1726867232.56089: stdout chunk (state=3): >>>import 'genericpath' # <<< 15247 1726867232.56104: stdout chunk (state=3): >>>import 'posixpath' # <<< 15247 1726867232.56147: stdout chunk (state=3): >>>import 'os' # <<< 15247 1726867232.56185: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15247 1726867232.56298: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 15247 1726867232.56302: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 15247 1726867232.56322: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 15247 1726867232.56504: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 15247 1726867232.56508: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b3c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b3c9fa0> import 'site' # <<< 15247 1726867232.56538: stdout chunk (state=3): >>>Python 
3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 15247 1726867232.57174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15247 1726867232.57199: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 15247 1726867232.57265: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867232.57283: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 15247 1726867232.57388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b407e90> <<< 15247 1726867232.57492: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15247 1726867232.57807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 15247 1726867232.58063: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b407f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b43f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b43ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b41fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b41d280> <<< 15247 1726867232.58460: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b405040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b45f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b45e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 15247 1726867232.58611: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b41e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b45cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b494890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4042c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b494d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b494bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b494fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b402de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # 
code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4953a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4965d0> import 'importlib.util' # import 'runpy' # <<< 15247 1726867232.58689: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b4adeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 15247 1726867232.58792: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4aed50> # extension module 
'_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b4af380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4ae2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15247 1726867232.58807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 15247 1726867232.58871: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b4afe00> <<< 15247 1726867232.58899: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4af530> <<< 15247 1726867232.59284: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b496570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1a7ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc 
matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d0770> <<< 15247 1726867232.59287: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 15247 1726867232.59299: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.59567: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d10a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d1a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d0950> import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1a5e80> <<< 15247 1726867232.59571: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 15247 1726867232.59603: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 15247 1726867232.59607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 15247 1726867232.59652: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d2e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d18e0> <<< 15247 1726867232.59667: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b496cc0> <<< 15247 1726867232.59760: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 15247 1726867232.59764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 15247 1726867232.59802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15247 1726867232.59825: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1fb170> <<< 15247 1726867232.59926: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 15247 
1726867232.59929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 15247 1726867232.59984: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 15247 1726867232.59987: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b21f4d0> <<< 15247 1726867232.60009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 15247 1726867232.60073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15247 1726867232.60090: stdout chunk (state=3): >>>import 'ntpath' # <<< 15247 1726867232.60132: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b2802f0> <<< 15247 1726867232.60337: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15247 1726867232.60341: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b282a20> <<< 15247 1726867232.60460: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b2803e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b2452e0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab293d0> <<< 15247 1726867232.60475: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b21e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d3d40> <<< 15247 1726867232.60668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15247 1726867232.60897: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe11b21e660> <<< 15247 1726867232.61051: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_8ovjc3zd/ansible_setup_payload.zip' # zipimport: zlib available <<< 15247 1726867232.61422: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 15247 1726867232.61426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab930e0> <<< 15247 1726867232.61489: stdout chunk (state=3): >>>import '_typing' # <<< 15247 1726867232.62478: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fe11ab71fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab71160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 15247 1726867232.64172: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.65305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab91760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867232.65309: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15247 1726867232.65518: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11abc2a80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc2810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc2120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc 
matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc2b40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b61a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11abc37a0> <<< 15247 1726867232.65623: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11abc3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15247 1726867232.65709: stdout chunk (state=3): >>>import '_locale' # <<< 15247 1726867232.66036: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa2dc40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fe11aa2f830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa30230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa313d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 15247 1726867232.66140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15247 1726867232.66163: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa33ec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b402ed0> <<< 15247 1726867232.66190: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa32180> <<< 15247 1726867232.66256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15247 1726867232.66266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 15247 1726867232.66327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 15247 1726867232.66466: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 15247 1726867232.66594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3bda0> import '_tokenize' # <<< 15247 1726867232.66621: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3a870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3a5d0><<< 15247 1726867232.66655: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 15247 1726867232.66666: stdout chunk (state=3): >>> <<< 15247 1726867232.66790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15247 1726867232.66822: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3ab40> <<< 15247 1726867232.66898: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa32690> <<< 15247 1726867232.66938: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.66955: stdout chunk (state=3): >>># extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.66968: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa7fa40> <<< 15247 1726867232.67016: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 15247 1726867232.67069: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 15247 1726867232.67117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 15247 1726867232.67159: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 15247 1726867232.67163: stdout chunk (state=3): >>> <<< 15247 1726867232.67298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa81c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa81a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 15247 1726867232.67346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 15247 
1726867232.67434: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa84110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa822d0><<< 15247 1726867232.67484: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15247 1726867232.67562: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc'<<< 15247 1726867232.67585: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15247 1726867232.67634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15247 1726867232.67761: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa87860> <<< 15247 1726867232.67935: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa84230> <<< 15247 1726867232.68273: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa88650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa88680> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa88b90> <<< 15247 1726867232.68280: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa80380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 15247 1726867232.68325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15247 1726867232.68413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.68599: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.68696: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a9141d0> <<< 15247 1726867232.68745: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.68802: stdout chunk (state=3): >>># extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a915460> <<< 15247 1726867232.68835: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa8a960> <<< 15247 1726867232.68893: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.69035: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa8bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa8a600> # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.69049: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 15247 1726867232.69091: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.69245: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.69599: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available<<< 15247 1726867232.69798: stdout chunk (state=3): >>> # zipimport: zlib available <<< 15247 1726867232.69931: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.70888: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.71816: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 15247 1726867232.71874: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 15247 1726867232.71981: stdout chunk (state=3): >>>import 
'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 15247 1726867232.72015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867232.72101: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 15247 1726867232.72317: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a919610> <<< 15247 1726867232.72333: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a91a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a915580> <<< 15247 1726867232.72591: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 15247 1726867232.72897: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.73032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15247 1726867232.73135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a91a4e0> # zipimport: zlib available <<< 15247 1726867232.74082: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15247 1726867232.74367: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.74960: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 15247 1726867232.75000: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.75100: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15247 1726867232.75396: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.75831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15247 1726867232.75849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15247 1726867232.75862: stdout chunk (state=3): >>>import '_ast' # <<< 15247 1726867232.75960: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a91b620> <<< 15247 1726867232.76083: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.76173: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 15247 1726867232.76271: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.76318: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 15247 1726867232.76332: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15247 1726867232.76388: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.76520: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.76616: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15247 1726867232.76670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867232.76901: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a926120> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a923e00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 15247 1726867232.76994: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.77068: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.77097: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.77151: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867232.77533: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa0ea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abee720> <<< 15247 1726867232.77597: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a926240> <<< 15247 1726867232.77610: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa88c80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 15247 1726867232.77637: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.77665: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 15247 1726867232.77680: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 15247 1726867232.77743: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15247 1726867232.77769: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 15247 1726867232.77873: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.77951: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.77988: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.78051: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.78125: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: 
zlib available <<< 15247 1726867232.78263: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 15247 1726867232.78389: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.78487: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.78497: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 15247 1726867232.78772: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.79091: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.79182: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 15247 1726867232.79323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 15247 1726867232.79501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object 
from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5cbf50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a5d0440> <<< 15247 1726867232.79565: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a99eff0> <<< 15247 1726867232.79676: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b6c90> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b4800> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b43b0> <<< 15247 1726867232.79701: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 15247 1726867232.79729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 15247 1726867232.79755: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 15247 1726867232.79848: stdout chunk (state=3): >>># 
extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a5d3380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d2c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a5d2e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d2060> <<< 15247 1726867232.79893: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 15247 1726867232.80081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d3500> <<< 15247 1726867232.80114: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 15247 1726867232.80148: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a62dfd0> <<< 15247 
1726867232.80232: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d3fb0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b4500> import 'ansible.module_utils.facts.timeout' # <<< 15247 1726867232.80697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 15247 1726867232.80718: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 15247 1726867232.80748: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.80826: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 15247 1726867232.81035: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.81084: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.81154: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.81237: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 15247 1726867232.81255: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 15247 1726867232.82233: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.82700: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.distribution' # <<< 15247 1726867232.82713: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.82773: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.82857: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.82906: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.82990: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 15247 1726867232.83103: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.83175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 15247 1726867232.83281: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 15247 1726867232.83315: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.83351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 15247 1726867232.83459: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.83583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 15247 1726867232.83618: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a62e120> <<< 15247 1726867232.83754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 15247 1726867232.83859: 
stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a62edb0> <<< 15247 1726867232.83882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 15247 1726867232.83973: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.84105: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 15247 1726867232.84182: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.84322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 15247 1726867232.84497: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # <<< 15247 1726867232.84510: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.84597: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.84642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 15247 1726867232.84699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 15247 1726867232.84798: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.84870: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a66e3c0> <<< 15247 1726867232.85298: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a65f1d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.85330: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 15247 1726867232.85568: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.85893: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.85963: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 15247 1726867232.86216: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 15247 1726867232.86239: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.86261: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a681f10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a65f350> <<< 15247 1726867232.86263: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 15247 1726867232.86288: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.86300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 15247 1726867232.86355: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.86493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 15247 1726867232.86652: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15247 1726867232.87413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 15247 1726867232.87417: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.87419: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.87421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.87694: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.87758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 15247 1726867232.87774: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 15247 1726867232.87948: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.88197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.88223: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.89109: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.89958: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 15247 1726867232.90078: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.90226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 15247 1726867232.90242: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.90373: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.90529: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # 
zipimport: zlib available <<< 15247 1726867232.91016: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 15247 1726867232.91022: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 15247 1726867232.91035: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.91079: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.91154: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 15247 1726867232.91318: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.91433: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.91763: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.92047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 15247 1726867232.92051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 15247 1726867232.92053: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.92144: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.92147: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 15247 1726867232.92209: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.92321: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.92421: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 15247 1726867232.92428: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.92474: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 
15247 1726867232.92585: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.92995: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 15247 1726867232.93209: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.93654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 15247 1726867232.93657: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.93775: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.93959: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 15247 1726867232.94098: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 15247 1726867232.94172: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.94291: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 15247 1726867232.94395: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 15247 1726867232.94398: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.94441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 15247 1726867232.94449: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.94543: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.94554: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 15247 1726867232.94716: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.94971: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867232.95052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 15247 1726867232.95279: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.95568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 15247 1726867232.95629: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.95717: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 15247 1726867232.95767: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.95842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 15247 1726867232.95988: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.96134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 15247 1726867232.96138: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.96190: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.96310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 15247 1726867232.96349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 15247 1726867232.96573: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867232.97366: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 15247 1726867232.97392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 15247 1726867232.97427: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 15247 1726867232.97495: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867232.97499: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a4870b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a485250> <<< 15247 1726867232.97873: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a484dd0> <<< 15247 1726867232.98398: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "32", "epoch": "1726867232", "epoch_int": "1726867232", "date": "2024-09-20", "time": "17:20:32", "iso8601_micro": "2024-09-20T21:20:32.971364Z", "iso8601": "2024-09-20T21:20:32Z", "iso8601_basic": "20240920T172032971364", "iso8601_basic_short": "20240920T172032", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, 
"minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867232.99110: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref <<< 15247 1726867232.99249: stdout chunk (state=3): >>># cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy 
site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing 
ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] 
removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd <<< 15247 1726867232.99258: stdout chunk (state=3): >>># cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux 
# destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network <<< 15247 1726867232.99384: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 15247 1726867232.99592: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 15247 1726867232.99668: stdout chunk (state=3): >>># destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 15247 1726867232.99799: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 15247 1726867232.99802: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 15247 1726867232.99873: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 15247 1726867233.00049: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json <<< 15247 1726867233.00053: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 15247 1726867233.00431: stdout chunk (state=3): >>># destroy 
glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping 
stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 15247 1726867233.00519: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 15247 1726867233.00522: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 15247 1726867233.00614: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 15247 1726867233.00618: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 15247 1726867233.00699: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 15247 1726867233.00724: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy 
encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 15247 1726867233.00874: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 15247 1726867233.00888: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 15247 1726867233.01317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867233.01436: stderr chunk (state=3): >>><<< 15247 1726867233.01440: stdout chunk (state=3): >>><<< 15247 1726867233.01714: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b6184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b5e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b61aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b3c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b3c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b407e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b407f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b43f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe11b43ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b41fb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b41d280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b405040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b45f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b45e420> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b41e150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b45cc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b494890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4042c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b494d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b494bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b494fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b402de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4956d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4953a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4965d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b4adeb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4aed50> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b4af380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4ae2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b4afe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b4af530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b496570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1a7ce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d0740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d04a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d0770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d10a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b1d1a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d0950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1a5e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d2e10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d18e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b496cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1fb170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b21f4d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b2802f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b282a20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b2803e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b2452e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b21e300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b1d3d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe11b21e660> # zipimport: found 103 names in '/tmp/ansible_setup_payload_8ovjc3zd/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab930e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab71fd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab71160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11ab91760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11abc2a80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc2810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc2120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc2b40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11b61a9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11abc37a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11abc3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abc3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa2dc40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa2f830> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa30230> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa313d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa33ec0> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11b402ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa32180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3bda0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3a870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3a5d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa3ab40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa32690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa7fa40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa81c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa81a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa84110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa822d0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa87860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa84230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa88650> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa88680> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa88b90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa80380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a9141d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a915460> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa8a960> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11aa8bd10> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa8a600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a919610> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a91a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a915580> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a91a4e0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a91b620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a926120> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a923e00> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa0ea50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11abee720> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a926240> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11aa88c80> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5cbf50> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a5d0440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a99eff0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fe11a9b6c90> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b4800> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b43b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a5d3380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d2c30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a5d2e10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d2060> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d3500> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a62dfd0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a5d3fb0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a9b4500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a62e120> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a62edb0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a66e3c0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a65f1d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a681f10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a65f350> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe11a4870b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a485250> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe11a484dd0> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "32", "epoch": "1726867232", "epoch_int": "1726867232", "date": "2024-09-20", "time": "17:20:32", "iso8601_micro": "2024-09-20T21:20:32.971364Z", "iso8601": "2024-09-20T21:20:32Z", "iso8601_basic": "20240920T172032971364", "iso8601_basic_short": "20240920T172032", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": 
{"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing 
re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # 
cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # 
cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy 
ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # 
destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping 
encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy 
_typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match 
found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.12.116 closed.
[WARNING]: Module invocation had junk after the JSON data: (Python interpreter shutdown/cleanup output omitted)
15247 1726867233.04888: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable':
'/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
15247 1726867233.04891: _low_level_execute_command(): starting
15247 1726867233.04893: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867232.3321695-15375-17512362496251/ > /dev/null 2>&1 && sleep 0'
15247 1726867233.04896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
15247 1726867233.04898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15247 1726867233.04901: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
15247 1726867233.04903: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
15247 1726867233.04905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
15247 1726867233.04907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<<
15247 1726867233.04909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
15247 1726867233.04911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
15247 1726867233.06222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
15247 1726867233.06282: stderr chunk (state=3): >>><<<
15247 1726867233.06385: stdout chunk (state=3): >>><<<
15247 1726867233.06389: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.12.116 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
15247 1726867233.06392: handler run complete
15247 1726867233.06443: variable 'ansible_facts' from source: unknown
15247 1726867233.06792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867233.06846: variable 'ansible_facts' from source: unknown
15247 1726867233.06990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867233.07056: attempt loop complete, returning result
15247 1726867233.07123: _execute() done
15247 1726867233.07131: dumping result to json
15247 1726867233.07148: done dumping result, returning
15247 1726867233.07162: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-8ce3-1923-00000000008f]
15247 1726867233.07173: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000008f
ok: [managed_node2]
15247 1726867233.07532: no more pending results, returning what we have
15247 1726867233.07535: results queue empty
15247 1726867233.07536: checking for any_errors_fatal
15247 1726867233.07538: done checking for any_errors_fatal
15247 1726867233.07539: checking for max_fail_percentage
15247 1726867233.07541: done checking for max_fail_percentage
15247 1726867233.07541: checking to see if all hosts have failed and the running result is not ok
15247 1726867233.07542: done checking to see if all hosts have failed
15247 1726867233.07543: getting the remaining hosts for this loop
15247 1726867233.07544: done getting the remaining hosts for this loop
15247 1726867233.07548: getting the next task for host managed_node2
15247 1726867233.07558: done getting next task for host managed_node2
15247 1726867233.07561: ^ task is: TASK: Check if system is ostree
15247 1726867233.07564: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867233.07567: getting variables
15247 1726867233.07569: in VariableManager get_vars()
15247 1726867233.08001: Calling all_inventory to load vars for managed_node2
15247 1726867233.08003: Calling groups_inventory to load vars for managed_node2
15247 1726867233.08007: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867233.08017: Calling all_plugins_play to load vars for managed_node2
15247 1726867233.08020: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867233.08024: Calling groups_plugins_play to load vars for managed_node2
15247 1726867233.08576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867233.09310: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000008f
15247 1726867233.09314: WORKER PROCESS EXITING
15247 1726867233.09340: done with get_vars()
15247 1726867233.09350: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Friday 20 September 2024 17:20:33 -0400 (0:00:00.846) 0:00:02.806 ******
15247 1726867233.09663: entering _queue_task() for managed_node2/stat
15247 1726867233.10339: worker is 1 (out of 1 available)
15247 1726867233.10350: exiting _queue_task() for managed_node2/stat
15247 1726867233.10362: done queuing things up, now waiting for results queue to drain
15247 1726867233.10363: waiting for pending results...
15247 1726867233.10789: running TaskExecutor() for managed_node2/TASK: Check if system is ostree
15247 1726867233.10975: in run() - task 0affcac9-a3a5-8ce3-1923-000000000091
15247 1726867233.11070: variable 'ansible_search_path' from source: unknown
15247 1726867233.11080: variable 'ansible_search_path' from source: unknown
15247 1726867233.11133: calling self._execute()
15247 1726867233.11214: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867233.11227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867233.11245: variable 'omit' from source: magic vars
15247 1726867233.11852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15247 1726867233.12589: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15247 1726867233.12592: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15247 1726867233.12595: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15247 1726867233.12732: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15247 1726867233.12916: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15247 1726867233.13053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15247 1726867233.13094: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867233.13315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15247 1726867233.13538: Evaluated conditional (not __network_is_ostree is defined): True
15247 1726867233.13566: variable 'omit' from source: magic vars
15247 1726867233.13687: variable 'omit' from source: magic vars
15247 1726867233.13690: variable 'omit' from source: magic vars
15247 1726867233.13729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
15247 1726867233.13788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
15247 1726867233.13817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
15247 1726867233.13840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15247 1726867233.13858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
15247 1726867233.13897: variable 'inventory_hostname' from source: host vars for 'managed_node2'
15247 1726867233.13914: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867233.13933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867233.14111: Set connection var ansible_shell_executable to /bin/sh
15247 1726867233.14115: Set connection var ansible_connection to ssh
15247 1726867233.14122: Set connection var ansible_shell_type to sh
15247 1726867233.14124: Set connection var ansible_module_compression to ZIP_DEFLATED
15247 1726867233.14127: Set connection var ansible_timeout to 10
15247 1726867233.14129: Set connection var ansible_pipelining to False
15247 1726867233.14131: variable 'ansible_shell_executable' from source: unknown
15247 1726867233.14133: variable 'ansible_connection' from
source: unknown 15247 1726867233.14135: variable 'ansible_module_compression' from source: unknown 15247 1726867233.14137: variable 'ansible_shell_type' from source: unknown 15247 1726867233.14141: variable 'ansible_shell_executable' from source: unknown 15247 1726867233.14150: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867233.14159: variable 'ansible_pipelining' from source: unknown 15247 1726867233.14166: variable 'ansible_timeout' from source: unknown 15247 1726867233.14173: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867233.14340: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867233.14357: variable 'omit' from source: magic vars 15247 1726867233.14367: starting attempt loop 15247 1726867233.14374: running the handler 15247 1726867233.14438: _low_level_execute_command(): starting 15247 1726867233.14441: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867233.15171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867233.15196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867233.15217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867233.15291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15247 1726867233.17081: stdout chunk (state=3): >>>/root <<< 15247 1726867233.17557: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867233.17559: stdout chunk (state=3): >>><<< 15247 1726867233.17561: stderr chunk (state=3): >>><<< 15247 1726867233.17563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15247 1726867233.17572: _low_level_execute_command(): starting 15247 1726867233.17575: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023 `" && echo ansible-tmp-1726867233.1737933-15406-159997375528023="` echo /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023 `" ) && sleep 0' 15247 1726867233.18825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867233.18828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867233.18830: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867233.18833: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867233.18835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867233.19037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867233.19212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 
1726867233.19496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867233.21313: stdout chunk (state=3): >>>ansible-tmp-1726867233.1737933-15406-159997375528023=/root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023 <<< 15247 1726867233.21512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867233.21516: stdout chunk (state=3): >>><<< 15247 1726867233.21518: stderr chunk (state=3): >>><<< 15247 1726867233.21813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867233.1737933-15406-159997375528023=/root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867233.21816: variable 'ansible_module_compression' from source: unknown 15247 1726867233.21906: ANSIBALLZ: Using lock 
for stat 15247 1726867233.21916: ANSIBALLZ: Acquiring lock 15247 1726867233.21926: ANSIBALLZ: Lock acquired: 140393878429504 15247 1726867233.21935: ANSIBALLZ: Creating module 15247 1726867233.58451: ANSIBALLZ: Writing module into payload 15247 1726867233.58684: ANSIBALLZ: Writing module 15247 1726867233.58752: ANSIBALLZ: Renaming module 15247 1726867233.58764: ANSIBALLZ: Done creating module 15247 1726867233.58791: variable 'ansible_facts' from source: unknown 15247 1726867233.58983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py 15247 1726867233.59415: Sending initial data 15247 1726867233.59418: Sent initial data (153 bytes) 15247 1726867233.61349: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867233.61395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867233.61410: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867233.61456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867233.61483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867233.61496: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867233.61700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867233.61789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867233.63923: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867233.64143: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867233.64246: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp26nq0_f7 /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py <<< 15247 1726867233.64254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py" <<< 15247 1726867233.64258: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp26nq0_f7" to remote "/root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py" <<< 15247 1726867233.65200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867233.65391: stderr chunk (state=3): >>><<< 15247 1726867233.65395: stdout chunk (state=3): >>><<< 15247 1726867233.65400: done transferring module to remote 15247 1726867233.65402: _low_level_execute_command(): starting 15247 1726867233.65408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/ /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py && sleep 0' 15247 1726867233.66897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867233.66919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867233.66957: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867233.67175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867233.67200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867233.67294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867233.69594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867233.69637: stderr chunk (state=3): >>><<< 15247 1726867233.69648: stdout chunk (state=3): >>><<< 15247 1726867233.69673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867233.69687: _low_level_execute_command(): starting 15247 1726867233.69695: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/AnsiballZ_stat.py && sleep 0' 15247 1726867233.70456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867233.70459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867233.70633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867233.72918: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 15247 
1726867233.72978: stdout chunk (state=3): >>>import _imp # builtin <<< 15247 1726867233.73003: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 15247 1726867233.73097: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 15247 1726867233.73148: stdout chunk (state=3): >>>import 'posix' # <<< 15247 1726867233.73197: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 15247 1726867233.73259: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 15247 1726867233.73315: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.73356: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 15247 1726867233.73402: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 15247 1726867233.73438: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 15247 1726867233.73456: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64923b84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492387b30> <<< 15247 1726867233.73491: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64923baa50> <<< 15247 1726867233.73543: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 15247 1726867233.73627: stdout chunk 
(state=3): >>>import 'abc' # <<< 15247 1726867233.73641: stdout chunk (state=3): >>>import 'io' # <<< 15247 1726867233.73660: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 15247 1726867233.73754: stdout chunk (state=3): >>>import '_collections_abc' # <<< 15247 1726867233.73801: stdout chunk (state=3): >>>import 'genericpath' # <<< 15247 1726867233.73848: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # <<< 15247 1726867233.73910: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 15247 1726867233.73945: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages'<<< 15247 1726867233.73992: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages' <<< 15247 1726867233.74051: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 15247 1726867233.74065: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492169130> <<< 15247 1726867233.74233: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 15247 1726867233.74236: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492169fa0> import 'site' # <<< 15247 1726867233.74265: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux<<< 15247 1726867233.74335: stdout chunk 
(state=3): >>> Type "help", "copyright", "credits" or "license" for more information.<<< 15247 1726867233.74553: stdout chunk (state=3): >>> <<< 15247 1726867233.74667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 15247 1726867233.74687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 15247 1726867233.74706: stdout chunk (state=3): >>> <<< 15247 1726867233.74786: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 15247 1726867233.74806: stdout chunk (state=3): >>> <<< 15247 1726867233.74849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 15247 1726867233.74904: stdout chunk (state=3): >>> <<< 15247 1726867233.75006: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 15247 1726867233.75012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 15247 1726867233.75033: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 15247 1726867233.75087: stdout chunk (state=3): >>>import '_operator' # <<< 15247 1726867233.75130: stdout chunk (state=3): >>> import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a7f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py <<< 15247 1726867233.75167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc'<<< 15247 1726867233.75244: stdout chunk (state=3): >>> <<< 15247 1726867233.75256: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 15247 1726867233.75355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 15247 1726867233.75380: stdout chunk (state=3): >>> import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 15247 1726867233.75418: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921df890><<< 15247 1726867233.75483: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 15247 1726867233.75486: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921dff20><<< 15247 1726867233.75568: stdout chunk (state=3): >>> import '_collections' # <<< 15247 1726867233.75594: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921bfb60> import '_functools' # <<< 15247 1726867233.75696: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921bd280> <<< 15247 1726867233.75779: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a5040><<< 15247 1726867233.75836: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc 
matches /usr/lib64/python3.12/re/_compiler.py <<< 15247 1726867233.75925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 15247 1726867233.75928: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 15247 1726867233.76104: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 15247 1726867233.76108: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921ff800> <<< 15247 1726867233.76124: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921fe420> <<< 15247 1726867233.76154: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921be150><<< 15247 1726867233.76181: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921fcc80><<< 15247 1726867233.76311: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492234890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a42c0> <<< 15247 1726867233.76339: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 15247 1726867233.76393: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 15247 1726867233.76536: stdout chunk (state=3): >>> # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6492234d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492234bf0><<< 15247 1726867233.76539: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6492234fe0> <<< 15247 1726867233.76564: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 15247 1726867233.76618: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.76633: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 15247 1726867233.76705: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64922356d0> <<< 15247 1726867233.76731: stdout chunk (state=3): >>>import 'importlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f64922353a0> import 'importlib.machinery' # <<< 15247 1726867233.76773: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 15247 1726867233.76829: stdout chunk (state=3): >>> import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64922365d0><<< 15247 1726867233.76860: stdout chunk (state=3): >>> import 'importlib.util' # import 'runpy' # <<< 15247 1726867233.77129: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867233.77135: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649224deb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 15247 1726867233.77149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 15247 1726867233.77185: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 15247 1726867233.77217: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 15247 1726867233.77235: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224ed50> <<< 15247 1726867233.77335: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649224f380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224e2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 15247 1726867233.77355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 15247 1726867233.77409: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867233.77462: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649224fe00> <<< 15247 1726867233.77466: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224f530><<< 15247 1726867233.77538: stdout chunk (state=3): >>> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492236570> <<< 15247 1726867233.77567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 15247 1726867233.77612: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 15247 1726867233.77656: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 15247 1726867233.77772: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491fcfce0><<< 15247 1726867233.77974: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ff84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff8770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867233.78155: stdout chunk (state=3): >>># extension module '_hashlib' executed from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 15247 1726867233.78294: stdout chunk (state=3): >>> import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff90a0> <<< 15247 1726867233.78373: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867233.78414: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff9a60><<< 15247 1726867233.78439: stdout chunk (state=3): >>> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ff8950> <<< 15247 1726867233.78470: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491fcde80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 15247 1726867233.78535: stdout chunk (state=3): >>> <<< 15247 1726867233.78565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 15247 1726867233.78595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ffae10> <<< 15247 1726867233.78653: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ff98e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492236cc0> <<< 15247 1726867233.78693: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 
15247 1726867233.78873: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.78889: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 15247 1726867233.78932: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492023170> <<< 15247 1726867233.79090: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py<<< 15247 1726867233.79120: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 15247 1726867233.79133: stdout chunk (state=3): >>> <<< 15247 1726867233.79193: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920474d0> <<< 15247 1726867233.79215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py<<< 15247 1726867233.79304: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 15247 1726867233.79384: stdout chunk (state=3): >>>import 'ntpath' # <<< 15247 1726867233.79450: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.79482: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f64920a82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 15247 1726867233.79556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 15247 1726867233.79631: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 15247 1726867233.79774: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920aaa20> <<< 15247 1726867233.79887: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920a83e0><<< 15247 1726867233.79944: stdout chunk (state=3): >>> <<< 15247 1726867233.80052: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649206d2e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919293d0><<< 15247 1726867233.80081: stdout chunk (state=3): >>> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492046300> <<< 15247 1726867233.80201: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ffbd40> <<< 15247 1726867233.80227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 15247 1726867233.80288: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6492046660> <<< 15247 1726867233.80513: stdout chunk (state=3): >>># zipimport: found 30 names in 
'/tmp/ansible_stat_payload_lodzmoqg/ansible_stat_payload.zip' # zipimport: zlib available <<< 15247 1726867233.80730: stdout chunk (state=3): >>># zipimport: zlib available<<< 15247 1726867233.80754: stdout chunk (state=3): >>> <<< 15247 1726867233.80865: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 15247 1726867233.80986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 15247 1726867233.81096: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649197f0e0> import '_typing' # <<< 15247 1726867233.81345: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649195dfd0> <<< 15247 1726867233.81374: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649195d160> # zipimport: zlib available<<< 15247 1726867233.81447: stdout chunk (state=3): >>> import 'ansible' # # zipimport: zlib available <<< 15247 1726867233.81531: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 15247 1726867233.81589: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.83679: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.85493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649197d760> <<< 15247 1726867233.85516: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.85562: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 15247 1726867233.85684: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64919a6a80> <<< 15247 1726867233.85687: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a6810> <<< 15247 1726867233.85899: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a6120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 15247 1726867233.85942: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a6570> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64923ba9c0> import 'atexit' # # extension module 
'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64919a77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64919a7a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 15247 1726867233.85963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 15247 1726867233.86034: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a7f20> <<< 15247 1726867233.86067: stdout chunk (state=3): >>>import 'pwd' # <<< 15247 1726867233.86099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 15247 1726867233.86106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 15247 1726867233.86233: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491811bb0> <<< 15247 1726867233.86239: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491813860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # 
code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 15247 1726867233.86336: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491814260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 15247 1726867233.86503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491815130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 15247 1726867233.86553: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491817e60> <<< 15247 1726867233.86632: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491817dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491816120> <<< 15247 1726867233.86680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 15247 1726867233.86780: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 15247 1726867233.86806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches 
/usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 15247 1726867233.86834: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181fd70> import '_tokenize' # <<< 15247 1726867233.87018: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181e840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181e5a0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 15247 1726867233.87165: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181eb10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491816630> <<< 15247 1726867233.87170: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491867a40> <<< 15247 1726867233.87288: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6491868170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 15247 1726867233.87672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491869c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64918699d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649186c1a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649186a300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 15247 1726867233.87706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 15247 1726867233.88182: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 15247 1726867233.88188: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649186f950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649186c320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491870770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64918709e0> <<< 15247 1726867233.88240: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491870c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64918682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 15247 1726867233.88246: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 15247 1726867233.88287: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867233.88319: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64918fc3e0> <<< 15247 1726867233.88571: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64918fd760> <<< 15247 1726867233.88685: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491872ba0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491873f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64918727b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 15247 1726867233.88827: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.89018: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 15247 1726867233.89024: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15247 1726867233.89386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867233.90315: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.91483: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.91486: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491701a30> <<< 15247 1726867233.91591: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491702750> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181fd40> <<< 15247 1726867233.91625: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 15247 1726867233.91653: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.91659: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.91688: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 15247 1726867233.91694: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.91939: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 15247 1726867233.92168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 15247 1726867233.92185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 15247 1726867233.92190: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64917026f0> <<< 15247 1726867233.92385: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.93059: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.93797: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.94027: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.94031: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 15247 1726867233.94037: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.94039: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.94156: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 15247 1726867233.94160: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.94162: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.94592: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 15247 1726867233.94989: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.95199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 15247 1726867233.95297: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 15247 1726867233.95303: stdout chunk (state=3): >>>import '_ast' # <<< 15247 1726867233.95410: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64917039b0> <<< 15247 1726867233.95417: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.95525: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.95634: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 15247 1726867233.95642: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 15247 1726867233.95646: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 15247 1726867233.95682: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.95892: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867233.95925: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.96110: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 15247 1726867233.96165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 15247 1726867233.96293: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 15247 1726867233.96299: stdout chunk (state=3): >>>import 'selinux._selinux' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f649170e480> <<< 15247 1726867233.96341: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491709250> <<< 15247 1726867233.96401: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 15247 1726867233.96758: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 15247 1726867233.96868: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 15247 1726867233.96884: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 15247 1726867233.96890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 15247 1726867233.96983: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919fec00> <<< 15247 1726867233.97089: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919ee8d0> <<< 15247 1726867233.97158: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649170e150> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6491704500> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 15247 1726867233.97211: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 15247 1726867233.97240: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 15247 1726867233.97320: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 15247 1726867233.97326: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.97412: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 15247 1726867233.97883: stdout chunk (state=3): >>># zipimport: zlib available <<< 15247 1726867233.98064: stdout chunk (state=3): >>># zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 15247 1726867233.98575: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 15247 1726867233.98785: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # 
cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases <<< 15247 1726867233.98793: stdout chunk (state=3): >>># cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site <<< 15247 1726867233.98917: stdout chunk (state=3): >>># destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # 
cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] 
removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat <<< 15247 1726867233.98921: stdout chunk (state=3): >>># destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 15247 1726867233.99283: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma <<< 15247 1726867233.99298: stdout chunk (state=3): 
>>># destroy zipfile._path <<< 15247 1726867233.99496: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 15247 1726867233.99528: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 15247 1726867233.99568: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 15247 1726867233.99572: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse<<< 15247 1726867233.99574: stdout chunk (state=3): >>> # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 15247 1726867233.99996: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # 
cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket <<< 15247 1726867234.00002: stdout chunk (state=3): >>># destroy _collections <<< 15247 1726867234.00034: stdout chunk (state=3): >>># destroy platform <<< 15247 1726867234.00039: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser 
# destroy tokenize <<< 15247 1726867234.00075: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 15247 1726867234.00120: stdout chunk (state=3): >>># destroy _typing <<< 15247 1726867234.00124: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 15247 1726867234.00429: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 15247 1726867234.00432: stdout chunk (state=3): >>># destroy itertools <<< 15247 1726867234.00435: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 15247 1726867234.00437: stdout chunk (state=3): >>># destroy _thread <<< 15247 1726867234.00783: stdout chunk (state=3): >>># clear sys.audit hooks <<< 15247 1726867234.01084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867234.01087: stdout chunk (state=3): >>><<< 15247 1726867234.01095: stderr chunk (state=3): >>><<< 15247 1726867234.01191: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64923b84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492387b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64923baa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492169130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492169fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a7e90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a7f50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921df890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921dff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921bfb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921bd280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a5040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921ff800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921fe420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921be150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921fcc80> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492234890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a42c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6492234d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492234bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6492234fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64921a2de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64922356d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64922353a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64922365d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649224deb0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224ed50> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649224f380> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224e2a0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649224fe00> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649224f530> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492236570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491fcfce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff8740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ff84a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff8770> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff90a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491ff9a60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ff8950> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491fcde80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ffae10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ff98e0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492236cc0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492023170> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920474d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920a82f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920aaa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64920a83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649206d2e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6492046300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491ffbd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6492046660> # zipimport: found 30 names in '/tmp/ansible_stat_payload_lodzmoqg/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f649197f0e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649195dfd0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649195d160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649197d760> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64919a6a80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a6810> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a6120> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a6570> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64923ba9c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64919a77d0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64919a7a10> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919a7f20> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491811bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491813860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6491814260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491815130> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491817e60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491817dd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491816120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f649181fd70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181e840> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181e5a0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181eb10> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491816630> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491867a40> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491868170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491869c10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64918699d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649186c1a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649186a300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649186f950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649186c320> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491870770> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64918709e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491870c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64918682f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64918fc3e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f64918fd760> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491872ba0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491873f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64918727b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6491701a30> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491702750> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649181fd40> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64917026f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64917039b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f649170e480> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491709250> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919fec00> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f64919ee8d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f649170e150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6491704500> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing 
_typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy 
ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] 
removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # 
cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] 
removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select 
# destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] 
wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # 
destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 15247 1726867234.02680: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867234.02684: _low_level_execute_command(): starting 15247 1726867234.02687: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867233.1737933-15406-159997375528023/ > /dev/null 2>&1 && sleep 0' 15247 1726867234.02893: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867234.02896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867234.02944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867234.02956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867234.03017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867234.03132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867234.05745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867234.05748: stdout chunk (state=3): >>><<< 15247 1726867234.05750: stderr chunk (state=3): >>><<< 15247 1726867234.05752: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867234.05763: handler run complete 15247 1726867234.05993: attempt loop complete, returning result 15247 1726867234.05996: _execute() done 15247 1726867234.05998: dumping result to json 15247 1726867234.06001: done dumping result, returning 15247 1726867234.06003: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree [0affcac9-a3a5-8ce3-1923-000000000091] 15247 1726867234.06008: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000091 ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15247 1726867234.06135: no more pending results, returning what we have 15247 1726867234.06138: results queue empty 15247 1726867234.06139: checking for any_errors_fatal 15247 1726867234.06146: done checking for any_errors_fatal 15247 1726867234.06147: checking for max_fail_percentage 15247 1726867234.06148: done checking for max_fail_percentage 15247 1726867234.06149: checking to see if all hosts have failed and the running result is not ok 15247 1726867234.06150: done checking to see if all hosts have failed 15247 1726867234.06150: getting the remaining hosts for this loop 15247 1726867234.06151: done getting the remaining hosts for this loop 15247 1726867234.06155: getting the next task for host managed_node2 15247 1726867234.06161: done getting next task for host managed_node2 15247 1726867234.06163: ^ task is: TASK: Set flag to indicate system is ostree 15247 1726867234.06165: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867234.06168: getting variables 15247 1726867234.06169: in VariableManager get_vars() 15247 1726867234.06201: Calling all_inventory to load vars for managed_node2 15247 1726867234.06204: Calling groups_inventory to load vars for managed_node2 15247 1726867234.06207: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867234.06220: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.06223: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.06227: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.06586: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000091 15247 1726867234.06589: WORKER PROCESS EXITING 15247 1726867234.06820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.07311: done with get_vars() 15247 1726867234.07323: done getting variables 15247 1726867234.07534: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:20:34 -0400 (0:00:00.978) 0:00:03.785 ****** 15247 1726867234.07684: entering _queue_task() for managed_node2/set_fact 15247 1726867234.07686: Creating lock for set_fact 15247 1726867234.08209: worker is 1 (out of 1 available) 15247 1726867234.08290: exiting _queue_task() for managed_node2/set_fact 
15247 1726867234.08304: done queuing things up, now waiting for results queue to drain 15247 1726867234.08305: waiting for pending results... 15247 1726867234.09301: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 15247 1726867234.09307: in run() - task 0affcac9-a3a5-8ce3-1923-000000000092 15247 1726867234.09310: variable 'ansible_search_path' from source: unknown 15247 1726867234.09312: variable 'ansible_search_path' from source: unknown 15247 1726867234.09315: calling self._execute() 15247 1726867234.09668: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.09672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.09675: variable 'omit' from source: magic vars 15247 1726867234.10872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867234.11681: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867234.11733: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867234.11773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867234.11888: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867234.12091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867234.12128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867234.12157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867234.12238: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867234.12509: Evaluated conditional (not __network_is_ostree is defined): True 15247 1726867234.12521: variable 'omit' from source: magic vars 15247 1726867234.12559: variable 'omit' from source: magic vars 15247 1726867234.12776: variable '__ostree_booted_stat' from source: set_fact 15247 1726867234.12838: variable 'omit' from source: magic vars 15247 1726867234.12922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867234.12926: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867234.12946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867234.12969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867234.12986: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867234.13021: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867234.13139: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.13143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.13147: Set connection var ansible_shell_executable to /bin/sh 15247 1726867234.13155: Set connection var ansible_connection to ssh 15247 1726867234.13161: Set connection var ansible_shell_type to sh 15247 1726867234.13172: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867234.13188: Set connection 
var ansible_timeout to 10 15247 1726867234.13197: Set connection var ansible_pipelining to False 15247 1726867234.13252: variable 'ansible_shell_executable' from source: unknown 15247 1726867234.13261: variable 'ansible_connection' from source: unknown 15247 1726867234.13286: variable 'ansible_module_compression' from source: unknown 15247 1726867234.13289: variable 'ansible_shell_type' from source: unknown 15247 1726867234.13291: variable 'ansible_shell_executable' from source: unknown 15247 1726867234.13293: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.13297: variable 'ansible_pipelining' from source: unknown 15247 1726867234.13307: variable 'ansible_timeout' from source: unknown 15247 1726867234.13316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.13422: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867234.13437: variable 'omit' from source: magic vars 15247 1726867234.13447: starting attempt loop 15247 1726867234.13463: running the handler 15247 1726867234.13482: handler run complete 15247 1726867234.13496: attempt loop complete, returning result 15247 1726867234.13506: _execute() done 15247 1726867234.13515: dumping result to json 15247 1726867234.13523: done dumping result, returning 15247 1726867234.13535: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-8ce3-1923-000000000092] 15247 1726867234.13574: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000092 ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 15247 1726867234.13694: no more pending results, returning what we have 15247 
1726867234.13697: results queue empty 15247 1726867234.13697: checking for any_errors_fatal 15247 1726867234.13703: done checking for any_errors_fatal 15247 1726867234.13704: checking for max_fail_percentage 15247 1726867234.13706: done checking for max_fail_percentage 15247 1726867234.13707: checking to see if all hosts have failed and the running result is not ok 15247 1726867234.13708: done checking to see if all hosts have failed 15247 1726867234.13708: getting the remaining hosts for this loop 15247 1726867234.13710: done getting the remaining hosts for this loop 15247 1726867234.13714: getting the next task for host managed_node2 15247 1726867234.13723: done getting next task for host managed_node2 15247 1726867234.13725: ^ task is: TASK: Fix CentOS6 Base repo 15247 1726867234.13727: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867234.13731: getting variables 15247 1726867234.13732: in VariableManager get_vars() 15247 1726867234.13760: Calling all_inventory to load vars for managed_node2 15247 1726867234.13762: Calling groups_inventory to load vars for managed_node2 15247 1726867234.13765: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867234.13775: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.13780: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.13784: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.14323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.14522: done with get_vars() 15247 1726867234.14644: done getting variables 15247 1726867234.14750: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000092 15247 1726867234.14753: WORKER PROCESS EXITING 15247 1726867234.14837: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:20:34 -0400 (0:00:00.074) 0:00:03.859 ****** 15247 1726867234.14974: entering _queue_task() for managed_node2/copy 15247 1726867234.15424: worker is 1 (out of 1 available) 15247 1726867234.15435: exiting _queue_task() for managed_node2/copy 15247 1726867234.15446: done queuing things up, now waiting for results queue to drain 15247 1726867234.15448: waiting for pending results... 
15247 1726867234.15918: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo
15247 1726867234.15998: in run() - task 0affcac9-a3a5-8ce3-1923-000000000094
15247 1726867234.16099: variable 'ansible_search_path' from source: unknown
15247 1726867234.16112: variable 'ansible_search_path' from source: unknown
15247 1726867234.16151: calling self._execute()
15247 1726867234.16219: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.16486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.16489: variable 'omit' from source: magic vars
15247 1726867234.17215: variable 'ansible_distribution' from source: facts
15247 1726867234.17255: Evaluated conditional (ansible_distribution == 'CentOS'): True
15247 1726867234.17587: variable 'ansible_distribution_major_version' from source: facts
15247 1726867234.17599: Evaluated conditional (ansible_distribution_major_version == '6'): False
15247 1726867234.17610: when evaluation is False, skipping this task
15247 1726867234.17617: _execute() done
15247 1726867234.17625: dumping result to json
15247 1726867234.17632: done dumping result, returning
15247 1726867234.17642: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-8ce3-1923-000000000094]
15247 1726867234.17654: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000094
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
15247 1726867234.17845: no more pending results, returning what we have
15247 1726867234.17848: results queue empty
15247 1726867234.17849: checking for any_errors_fatal
15247 1726867234.17853: done checking for any_errors_fatal
15247 1726867234.17854: checking for max_fail_percentage
15247 1726867234.17855: done checking for max_fail_percentage
15247 1726867234.17856: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.17857: done checking to see if all hosts have failed
15247 1726867234.17857: getting the remaining hosts for this loop
15247 1726867234.17859: done getting the remaining hosts for this loop
15247 1726867234.17861: getting the next task for host managed_node2
15247 1726867234.17868: done getting next task for host managed_node2
15247 1726867234.17870: ^ task is: TASK: Include the task 'enable_epel.yml'
15247 1726867234.17873: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.17876: getting variables
15247 1726867234.17879: in VariableManager get_vars()
15247 1726867234.17988: Calling all_inventory to load vars for managed_node2
15247 1726867234.17991: Calling groups_inventory to load vars for managed_node2
15247 1726867234.17995: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.18009: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.18020: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.18024: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.18388: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000094
15247 1726867234.18390: WORKER PROCESS EXITING
15247 1726867234.18408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.18815: done with get_vars()
15247 1726867234.18823: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Friday 20 September 2024 17:20:34 -0400 (0:00:00.040) 0:00:03.899 ******
15247 1726867234.19017: entering _queue_task() for managed_node2/include_tasks
15247 1726867234.19490: worker is 1 (out of 1 available)
15247 1726867234.19504: exiting _queue_task() for managed_node2/include_tasks
15247 1726867234.19518: done queuing things up, now waiting for results queue to drain
15247 1726867234.19519: waiting for pending results...
15247 1726867234.20026: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml'
15247 1726867234.20324: in run() - task 0affcac9-a3a5-8ce3-1923-000000000095
15247 1726867234.20376: variable 'ansible_search_path' from source: unknown
15247 1726867234.20485: variable 'ansible_search_path' from source: unknown
15247 1726867234.20489: calling self._execute()
15247 1726867234.20560: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.20573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.20595: variable 'omit' from source: magic vars
15247 1726867234.21636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15247 1726867234.26197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15247 1726867234.26281: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15247 1726867234.26383: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15247 1726867234.26479: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15247 1726867234.26571: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15247 1726867234.26730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867234.26813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867234.26891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867234.26933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867234.27011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867234.27386: variable '__network_is_ostree' from source: set_fact
15247 1726867234.27390: Evaluated conditional (not __network_is_ostree | d(false)): True
15247 1726867234.27392: _execute() done
15247 1726867234.27394: dumping result to json
15247 1726867234.27397: done dumping result, returning
15247 1726867234.27399: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-8ce3-1923-000000000095]
15247 1726867234.27402: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000095
15247 1726867234.27537: no more pending results, returning what we have
15247 1726867234.27543: in VariableManager get_vars()
15247 1726867234.27581: Calling all_inventory to load vars for managed_node2
15247 1726867234.27584: Calling groups_inventory to load vars for managed_node2
15247 1726867234.27588: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.27598: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.27601: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.27603: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.27994: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000095
15247 1726867234.27997: WORKER PROCESS EXITING
15247 1726867234.28026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.28447: done with get_vars()
15247 1726867234.28456: variable 'ansible_search_path' from source: unknown
15247 1726867234.28457: variable 'ansible_search_path' from source: unknown
15247 1726867234.29002: we have included files to process
15247 1726867234.29003: generating all_blocks data
15247 1726867234.29005: done generating all_blocks data
15247 1726867234.29010: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
15247 1726867234.29011: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
15247 1726867234.29014: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
15247 1726867234.31131: done processing included file
15247 1726867234.31133: iterating over new_blocks loaded from include file
15247 1726867234.31135: in VariableManager get_vars()
15247 1726867234.31147: done with get_vars()
15247 1726867234.31149: filtering new block on tags
15247 1726867234.31292: done filtering new block on tags
15247 1726867234.31295: in VariableManager get_vars()
15247 1726867234.31308: done with get_vars()
15247 1726867234.31310: filtering new block on tags
15247 1726867234.31322: done filtering new block on tags
15247 1726867234.31324: done iterating over new_blocks loaded from include file
included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2
15247 1726867234.31330: extending task lists for all hosts with included blocks
15247 1726867234.31760: done extending task lists
15247 1726867234.31761: done processing included files
15247 1726867234.31762: results queue empty
15247 1726867234.31763: checking for any_errors_fatal
15247 1726867234.31765: done checking for any_errors_fatal
15247 1726867234.31766: checking for max_fail_percentage
15247 1726867234.31767: done checking for max_fail_percentage
15247 1726867234.31768: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.31768: done checking to see if all hosts have failed
15247 1726867234.31769: getting the remaining hosts for this loop
15247 1726867234.31770: done getting the remaining hosts for this loop
15247 1726867234.31772: getting the next task for host managed_node2
15247 1726867234.31776: done getting next task for host managed_node2
15247 1726867234.31780: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
15247 1726867234.31782: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.31784: getting variables
15247 1726867234.31785: in VariableManager get_vars()
15247 1726867234.31981: Calling all_inventory to load vars for managed_node2
15247 1726867234.31984: Calling groups_inventory to load vars for managed_node2
15247 1726867234.31987: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.31992: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.31999: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.32002: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.32473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.32998: done with get_vars()
15247 1726867234.33012: done getting variables
15247 1726867234.33075: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
15247 1726867234.33539: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024 17:20:34 -0400 (0:00:00.146) 0:00:04.046 ******
15247 1726867234.33703: entering _queue_task() for managed_node2/command
15247 1726867234.33704: Creating lock for command
15247 1726867234.34664: worker is 1 (out of 1 available)
15247 1726867234.34676: exiting _queue_task() for managed_node2/command
15247 1726867234.34755: done queuing things up, now waiting for results queue to drain
15247 1726867234.34757: waiting for pending results...
15247 1726867234.35394: running TaskExecutor() for managed_node2/TASK: Create EPEL 10
15247 1726867234.35399: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000af
15247 1726867234.35402: variable 'ansible_search_path' from source: unknown
15247 1726867234.35407: variable 'ansible_search_path' from source: unknown
15247 1726867234.35410: calling self._execute()
15247 1726867234.35655: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.35666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.35684: variable 'omit' from source: magic vars
15247 1726867234.36450: variable 'ansible_distribution' from source: facts
15247 1726867234.36468: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
15247 1726867234.36711: variable 'ansible_distribution_major_version' from source: facts
15247 1726867234.36940: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
15247 1726867234.36944: when evaluation is False, skipping this task
15247 1726867234.36947: _execute() done
15247 1726867234.36949: dumping result to json
15247 1726867234.36951: done dumping result, returning
15247 1726867234.36953: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [0affcac9-a3a5-8ce3-1923-0000000000af]
15247 1726867234.36955: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000af
15247 1726867234.37033: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000af
15247 1726867234.37036: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
15247 1726867234.37091: no more pending results, returning what we have
15247 1726867234.37094: results queue empty
15247 1726867234.37095: checking for any_errors_fatal
15247 1726867234.37096: done checking for any_errors_fatal
15247 1726867234.37097: checking for max_fail_percentage
15247 1726867234.37098: done checking for max_fail_percentage
15247 1726867234.37099: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.37100: done checking to see if all hosts have failed
15247 1726867234.37101: getting the remaining hosts for this loop
15247 1726867234.37102: done getting the remaining hosts for this loop
15247 1726867234.37105: getting the next task for host managed_node2
15247 1726867234.37112: done getting next task for host managed_node2
15247 1726867234.37114: ^ task is: TASK: Install yum-utils package
15247 1726867234.37118: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.37122: getting variables
15247 1726867234.37123: in VariableManager get_vars()
15247 1726867234.37152: Calling all_inventory to load vars for managed_node2
15247 1726867234.37155: Calling groups_inventory to load vars for managed_node2
15247 1726867234.37159: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.37171: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.37174: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.37178: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.37713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.38102: done with get_vars()
15247 1726867234.38111: done getting variables
15247 1726867234.38322: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 17:20:34 -0400 (0:00:00.046) 0:00:04.093 ******
15247 1726867234.38349: entering _queue_task() for managed_node2/package
15247 1726867234.38351: Creating lock for package
15247 1726867234.38960: worker is 1 (out of 1 available)
15247 1726867234.38974: exiting _queue_task() for managed_node2/package
15247 1726867234.39287: done queuing things up, now waiting for results queue to drain
15247 1726867234.39289: waiting for pending results...
15247 1726867234.39644: running TaskExecutor() for managed_node2/TASK: Install yum-utils package
15247 1726867234.39669: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000b0
15247 1726867234.39783: variable 'ansible_search_path' from source: unknown
15247 1726867234.39786: variable 'ansible_search_path' from source: unknown
15247 1726867234.39963: calling self._execute()
15247 1726867234.39967: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.39970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.39973: variable 'omit' from source: magic vars
15247 1726867234.40766: variable 'ansible_distribution' from source: facts
15247 1726867234.40823: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
15247 1726867234.41137: variable 'ansible_distribution_major_version' from source: facts
15247 1726867234.41141: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
15247 1726867234.41144: when evaluation is False, skipping this task
15247 1726867234.41146: _execute() done
15247 1726867234.41150: dumping result to json
15247 1726867234.41153: done dumping result, returning
15247 1726867234.41155: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [0affcac9-a3a5-8ce3-1923-0000000000b0]
15247 1726867234.41270: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b0
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
15247 1726867234.41515: no more pending results, returning what we have
15247 1726867234.41519: results queue empty
15247 1726867234.41520: checking for any_errors_fatal
15247 1726867234.41527: done checking for any_errors_fatal
15247 1726867234.41528: checking for max_fail_percentage
15247 1726867234.41530: done checking for max_fail_percentage
15247 1726867234.41531: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.41532: done checking to see if all hosts have failed
15247 1726867234.41532: getting the remaining hosts for this loop
15247 1726867234.41534: done getting the remaining hosts for this loop
15247 1726867234.41537: getting the next task for host managed_node2
15247 1726867234.41545: done getting next task for host managed_node2
15247 1726867234.41548: ^ task is: TASK: Enable EPEL 7
15247 1726867234.41552: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.41556: getting variables
15247 1726867234.41558: in VariableManager get_vars()
15247 1726867234.41698: Calling all_inventory to load vars for managed_node2
15247 1726867234.41702: Calling groups_inventory to load vars for managed_node2
15247 1726867234.41706: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.41719: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.41724: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.41727: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.42073: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b0
15247 1726867234.42076: WORKER PROCESS EXITING
15247 1726867234.42098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.42296: done with get_vars()
15247 1726867234.42305: done getting variables
15247 1726867234.42364: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 17:20:34 -0400 (0:00:00.040) 0:00:04.133 ******
15247 1726867234.42394: entering _queue_task() for managed_node2/command
15247 1726867234.42634: worker is 1 (out of 1 available)
15247 1726867234.42761: exiting _queue_task() for managed_node2/command
15247 1726867234.42772: done queuing things up, now waiting for results queue to drain
15247 1726867234.42774: waiting for pending results...
15247 1726867234.42932: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7
15247 1726867234.43051: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000b1
15247 1726867234.43070: variable 'ansible_search_path' from source: unknown
15247 1726867234.43080: variable 'ansible_search_path' from source: unknown
15247 1726867234.43129: calling self._execute()
15247 1726867234.43208: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.43221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.43233: variable 'omit' from source: magic vars
15247 1726867234.43685: variable 'ansible_distribution' from source: facts
15247 1726867234.43702: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
15247 1726867234.43913: variable 'ansible_distribution_major_version' from source: facts
15247 1726867234.43926: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
15247 1726867234.43935: when evaluation is False, skipping this task
15247 1726867234.43942: _execute() done
15247 1726867234.43949: dumping result to json
15247 1726867234.43965: done dumping result, returning
15247 1726867234.43978: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [0affcac9-a3a5-8ce3-1923-0000000000b1]
15247 1726867234.43992: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b1
15247 1726867234.44182: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b1
15247 1726867234.44185: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
15247 1726867234.44227: no more pending results, returning what we have
15247 1726867234.44230: results queue empty
15247 1726867234.44231: checking for any_errors_fatal
15247 1726867234.44237: done checking for any_errors_fatal
15247 1726867234.44238: checking for max_fail_percentage
15247 1726867234.44239: done checking for max_fail_percentage
15247 1726867234.44240: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.44241: done checking to see if all hosts have failed
15247 1726867234.44241: getting the remaining hosts for this loop
15247 1726867234.44243: done getting the remaining hosts for this loop
15247 1726867234.44246: getting the next task for host managed_node2
15247 1726867234.44253: done getting next task for host managed_node2
15247 1726867234.44255: ^ task is: TASK: Enable EPEL 8
15247 1726867234.44259: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.44262: getting variables
15247 1726867234.44264: in VariableManager get_vars()
15247 1726867234.44402: Calling all_inventory to load vars for managed_node2
15247 1726867234.44405: Calling groups_inventory to load vars for managed_node2
15247 1726867234.44408: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.44418: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.44421: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.44424: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.44686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.44891: done with get_vars()
15247 1726867234.44900: done getting variables
15247 1726867234.44964: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 17:20:34 -0400 (0:00:00.025) 0:00:04.159 ******
15247 1726867234.44996: entering _queue_task() for managed_node2/command
15247 1726867234.45228: worker is 1 (out of 1 available)
15247 1726867234.45239: exiting _queue_task() for managed_node2/command
15247 1726867234.45252: done queuing things up, now waiting for results queue to drain
15247 1726867234.45254: waiting for pending results...
15247 1726867234.45506: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8
15247 1726867234.45885: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000b2
15247 1726867234.45889: variable 'ansible_search_path' from source: unknown
15247 1726867234.45892: variable 'ansible_search_path' from source: unknown
15247 1726867234.45896: calling self._execute()
15247 1726867234.45982: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.45995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.46010: variable 'omit' from source: magic vars
15247 1726867234.46413: variable 'ansible_distribution' from source: facts
15247 1726867234.46429: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
15247 1726867234.46564: variable 'ansible_distribution_major_version' from source: facts
15247 1726867234.46588: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
15247 1726867234.46595: when evaluation is False, skipping this task
15247 1726867234.46601: _execute() done
15247 1726867234.46608: dumping result to json
15247 1726867234.46617: done dumping result, returning
15247 1726867234.46627: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [0affcac9-a3a5-8ce3-1923-0000000000b2]
15247 1726867234.46637: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b2
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
15247 1726867234.46772: no more pending results, returning what we have
15247 1726867234.46775: results queue empty
15247 1726867234.46776: checking for any_errors_fatal
15247 1726867234.46782: done checking for any_errors_fatal
15247 1726867234.46783: checking for max_fail_percentage
15247 1726867234.46785: done checking for max_fail_percentage
15247 1726867234.46884: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.46885: done checking to see if all hosts have failed
15247 1726867234.46886: getting the remaining hosts for this loop
15247 1726867234.46887: done getting the remaining hosts for this loop
15247 1726867234.46891: getting the next task for host managed_node2
15247 1726867234.46902: done getting next task for host managed_node2
15247 1726867234.46904: ^ task is: TASK: Enable EPEL 6
15247 1726867234.46908: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.46912: getting variables
15247 1726867234.46913: in VariableManager get_vars()
15247 1726867234.46944: Calling all_inventory to load vars for managed_node2
15247 1726867234.46947: Calling groups_inventory to load vars for managed_node2
15247 1726867234.46951: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.46963: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.46966: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.46970: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.47110: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b2
15247 1726867234.47114: WORKER PROCESS EXITING
15247 1726867234.47358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.47566: done with get_vars()
15247 1726867234.47573: done getting variables
15247 1726867234.47625: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 17:20:34 -0400 (0:00:00.026) 0:00:04.186 ******
15247 1726867234.47657: entering _queue_task() for managed_node2/copy
15247 1726867234.48057: worker is 1 (out of 1 available)
15247 1726867234.48064: exiting _queue_task() for managed_node2/copy
15247 1726867234.48074: done queuing things up, now waiting for results queue to drain
15247 1726867234.48075: waiting for pending results...
15247 1726867234.48100: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6
15247 1726867234.48206: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000b4
15247 1726867234.48226: variable 'ansible_search_path' from source: unknown
15247 1726867234.48236: variable 'ansible_search_path' from source: unknown
15247 1726867234.48272: calling self._execute()
15247 1726867234.48352: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867234.48364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867234.48382: variable 'omit' from source: magic vars
15247 1726867234.48816: variable 'ansible_distribution' from source: facts
15247 1726867234.48834: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
15247 1726867234.48961: variable 'ansible_distribution_major_version' from source: facts
15247 1726867234.48972: Evaluated conditional (ansible_distribution_major_version == '6'): False
15247 1726867234.48982: when evaluation is False, skipping this task
15247 1726867234.48990: _execute() done
15247 1726867234.48997: dumping result to json
15247 1726867234.49004: done dumping result, returning
15247 1726867234.49015: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [0affcac9-a3a5-8ce3-1923-0000000000b4]
15247 1726867234.49026: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b4
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
15247 1726867234.49190: no more pending results, returning what we have
15247 1726867234.49194: results queue empty
15247 1726867234.49194: checking for any_errors_fatal
15247 1726867234.49197: done checking for any_errors_fatal
15247 1726867234.49198: checking for max_fail_percentage
15247 1726867234.49199: done checking for max_fail_percentage
15247 1726867234.49200: checking to see if all hosts have failed and the running result is not ok
15247 1726867234.49201: done checking to see if all hosts have failed
15247 1726867234.49201: getting the remaining hosts for this loop
15247 1726867234.49203: done getting the remaining hosts for this loop
15247 1726867234.49206: getting the next task for host managed_node2
15247 1726867234.49215: done getting next task for host managed_node2
15247 1726867234.49217: ^ task is: TASK: Set network provider to 'nm'
15247 1726867234.49219: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867234.49222: getting variables
15247 1726867234.49224: in VariableManager get_vars()
15247 1726867234.49253: Calling all_inventory to load vars for managed_node2
15247 1726867234.49255: Calling groups_inventory to load vars for managed_node2
15247 1726867234.49258: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867234.49269: Calling all_plugins_play to load vars for managed_node2
15247 1726867234.49271: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867234.49275: Calling groups_plugins_play to load vars for managed_node2
15247 1726867234.49662: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000b4
15247 1726867234.49665: WORKER PROCESS EXITING
15247 1726867234.49688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867234.49885: done with get_vars()
15247 1726867234.49893: done getting variables
15247 1726867234.49953: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Friday 20 September 2024 17:20:34 -0400 (0:00:00.023) 0:00:04.209 ****** 15247 1726867234.49979: entering _queue_task() for managed_node2/set_fact 15247 1726867234.50256: worker is 1 (out of 1 available) 15247 1726867234.50266: exiting _queue_task() for managed_node2/set_fact 15247 1726867234.50275: done queuing things up, now waiting for results queue to drain 15247 1726867234.50278: waiting for pending results... 15247 1726867234.50443: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 15247 1726867234.50536: in run() - task 0affcac9-a3a5-8ce3-1923-000000000007 15247 1726867234.50581: variable 'ansible_search_path' from source: unknown 15247 1726867234.50618: calling self._execute() 15247 1726867234.50701: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.50801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.50815: variable 'omit' from source: magic vars 15247 1726867234.51001: variable 'omit' from source: magic vars 15247 1726867234.51234: variable 'omit' from source: magic vars 15247 1726867234.51237: variable 'omit' from source: magic vars 15247 1726867234.51246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867234.51452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867234.51456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867234.51458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867234.51460: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867234.51462: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867234.51465: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.51467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.51776: Set connection var ansible_shell_executable to /bin/sh 15247 1726867234.51793: Set connection var ansible_connection to ssh 15247 1726867234.51801: Set connection var ansible_shell_type to sh 15247 1726867234.51813: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867234.51825: Set connection var ansible_timeout to 10 15247 1726867234.51835: Set connection var ansible_pipelining to False 15247 1726867234.51914: variable 'ansible_shell_executable' from source: unknown 15247 1726867234.51924: variable 'ansible_connection' from source: unknown 15247 1726867234.51931: variable 'ansible_module_compression' from source: unknown 15247 1726867234.51939: variable 'ansible_shell_type' from source: unknown 15247 1726867234.51946: variable 'ansible_shell_executable' from source: unknown 15247 1726867234.52082: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.52085: variable 'ansible_pipelining' from source: unknown 15247 1726867234.52088: variable 'ansible_timeout' from source: unknown 15247 1726867234.52090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.52265: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867234.52284: variable 'omit' from source: magic vars 15247 1726867234.52295: starting 
attempt loop 15247 1726867234.52302: running the handler 15247 1726867234.52346: handler run complete 15247 1726867234.52363: attempt loop complete, returning result 15247 1726867234.52370: _execute() done 15247 1726867234.52379: dumping result to json 15247 1726867234.52388: done dumping result, returning 15247 1726867234.52399: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [0affcac9-a3a5-8ce3-1923-000000000007] 15247 1726867234.52409: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 15247 1726867234.52587: no more pending results, returning what we have 15247 1726867234.52590: results queue empty 15247 1726867234.52591: checking for any_errors_fatal 15247 1726867234.52597: done checking for any_errors_fatal 15247 1726867234.52598: checking for max_fail_percentage 15247 1726867234.52600: done checking for max_fail_percentage 15247 1726867234.52601: checking to see if all hosts have failed and the running result is not ok 15247 1726867234.52602: done checking to see if all hosts have failed 15247 1726867234.52603: getting the remaining hosts for this loop 15247 1726867234.52604: done getting the remaining hosts for this loop 15247 1726867234.52608: getting the next task for host managed_node2 15247 1726867234.52616: done getting next task for host managed_node2 15247 1726867234.52619: ^ task is: TASK: meta (flush_handlers) 15247 1726867234.52620: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867234.52624: getting variables 15247 1726867234.52626: in VariableManager get_vars() 15247 1726867234.52655: Calling all_inventory to load vars for managed_node2 15247 1726867234.52658: Calling groups_inventory to load vars for managed_node2 15247 1726867234.52662: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867234.52672: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.52676: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.52681: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.53030: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000007 15247 1726867234.53033: WORKER PROCESS EXITING 15247 1726867234.53054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.53420: done with get_vars() 15247 1726867234.53427: done getting variables 15247 1726867234.53488: in VariableManager get_vars() 15247 1726867234.53496: Calling all_inventory to load vars for managed_node2 15247 1726867234.53498: Calling groups_inventory to load vars for managed_node2 15247 1726867234.53500: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867234.53503: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.53505: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.53508: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.53640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.53898: done with get_vars() 15247 1726867234.53918: done queuing things up, now waiting for results queue to drain 15247 1726867234.53920: results queue empty 15247 1726867234.53921: checking for any_errors_fatal 15247 1726867234.53922: done checking for any_errors_fatal 15247 1726867234.53923: checking for max_fail_percentage 15247 
1726867234.53924: done checking for max_fail_percentage 15247 1726867234.53924: checking to see if all hosts have failed and the running result is not ok 15247 1726867234.53925: done checking to see if all hosts have failed 15247 1726867234.53926: getting the remaining hosts for this loop 15247 1726867234.53926: done getting the remaining hosts for this loop 15247 1726867234.53928: getting the next task for host managed_node2 15247 1726867234.53931: done getting next task for host managed_node2 15247 1726867234.53932: ^ task is: TASK: meta (flush_handlers) 15247 1726867234.53934: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867234.53940: getting variables 15247 1726867234.53940: in VariableManager get_vars() 15247 1726867234.53947: Calling all_inventory to load vars for managed_node2 15247 1726867234.53949: Calling groups_inventory to load vars for managed_node2 15247 1726867234.53951: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867234.53954: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.53956: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.53958: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.54160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.54600: done with get_vars() 15247 1726867234.54607: done getting variables 15247 1726867234.54770: in VariableManager get_vars() 15247 1726867234.54779: Calling all_inventory to load vars for managed_node2 15247 1726867234.54781: Calling groups_inventory to load vars for managed_node2 15247 1726867234.54784: Calling all_plugins_inventory to load vars for managed_node2 15247 
1726867234.54787: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.54790: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.54793: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.55038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.55448: done with get_vars() 15247 1726867234.55457: done queuing things up, now waiting for results queue to drain 15247 1726867234.55458: results queue empty 15247 1726867234.55459: checking for any_errors_fatal 15247 1726867234.55460: done checking for any_errors_fatal 15247 1726867234.55460: checking for max_fail_percentage 15247 1726867234.55461: done checking for max_fail_percentage 15247 1726867234.55462: checking to see if all hosts have failed and the running result is not ok 15247 1726867234.55462: done checking to see if all hosts have failed 15247 1726867234.55463: getting the remaining hosts for this loop 15247 1726867234.55464: done getting the remaining hosts for this loop 15247 1726867234.55466: getting the next task for host managed_node2 15247 1726867234.55468: done getting next task for host managed_node2 15247 1726867234.55468: ^ task is: None 15247 1726867234.55469: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867234.55470: done queuing things up, now waiting for results queue to drain 15247 1726867234.55471: results queue empty 15247 1726867234.55472: checking for any_errors_fatal 15247 1726867234.55472: done checking for any_errors_fatal 15247 1726867234.55473: checking for max_fail_percentage 15247 1726867234.55474: done checking for max_fail_percentage 15247 1726867234.55474: checking to see if all hosts have failed and the running result is not ok 15247 1726867234.55475: done checking to see if all hosts have failed 15247 1726867234.55476: getting the next task for host managed_node2 15247 1726867234.55532: done getting next task for host managed_node2 15247 1726867234.55533: ^ task is: None 15247 1726867234.55536: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867234.55583: in VariableManager get_vars() 15247 1726867234.55598: done with get_vars() 15247 1726867234.55603: in VariableManager get_vars() 15247 1726867234.55623: done with get_vars() 15247 1726867234.55629: variable 'omit' from source: magic vars 15247 1726867234.55658: in VariableManager get_vars() 15247 1726867234.55667: done with get_vars() 15247 1726867234.55686: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 15247 1726867234.55969: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867234.56084: getting the remaining hosts for this loop 15247 1726867234.56085: done getting the remaining hosts for this loop 15247 1726867234.56088: getting the next task for host managed_node2 15247 1726867234.56090: done getting next task for host managed_node2 15247 1726867234.56092: ^ task is: TASK: Gathering Facts 15247 1726867234.56093: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867234.56095: getting variables 15247 1726867234.56096: in VariableManager get_vars() 15247 1726867234.56103: Calling all_inventory to load vars for managed_node2 15247 1726867234.56105: Calling groups_inventory to load vars for managed_node2 15247 1726867234.56107: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867234.56111: Calling all_plugins_play to load vars for managed_node2 15247 1726867234.56124: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867234.56127: Calling groups_plugins_play to load vars for managed_node2 15247 1726867234.56414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867234.56590: done with get_vars() 15247 1726867234.56685: done getting variables 15247 1726867234.56783: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Friday 20 September 2024 17:20:34 -0400 (0:00:00.068) 0:00:04.277 ****** 15247 1726867234.56804: entering _queue_task() for managed_node2/gather_facts 15247 1726867234.57303: worker is 1 (out of 1 available) 15247 1726867234.57315: exiting _queue_task() for managed_node2/gather_facts 15247 1726867234.57326: done queuing things up, now waiting for results queue to drain 15247 1726867234.57327: waiting for pending results... 
15247 1726867234.57607: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867234.57662: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000da 15247 1726867234.57685: variable 'ansible_search_path' from source: unknown 15247 1726867234.57733: calling self._execute() 15247 1726867234.57820: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.57833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.57848: variable 'omit' from source: magic vars 15247 1726867234.58232: variable 'ansible_distribution_major_version' from source: facts 15247 1726867234.58257: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867234.58267: variable 'omit' from source: magic vars 15247 1726867234.58301: variable 'omit' from source: magic vars 15247 1726867234.58360: variable 'omit' from source: magic vars 15247 1726867234.58393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867234.58434: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867234.58483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867234.58494: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867234.58580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867234.58583: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867234.58586: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.58588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.58666: Set connection var ansible_shell_executable to /bin/sh 15247 1726867234.58675: Set 
connection var ansible_connection to ssh 15247 1726867234.58694: Set connection var ansible_shell_type to sh 15247 1726867234.58706: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867234.58797: Set connection var ansible_timeout to 10 15247 1726867234.58801: Set connection var ansible_pipelining to False 15247 1726867234.58803: variable 'ansible_shell_executable' from source: unknown 15247 1726867234.58805: variable 'ansible_connection' from source: unknown 15247 1726867234.58807: variable 'ansible_module_compression' from source: unknown 15247 1726867234.58809: variable 'ansible_shell_type' from source: unknown 15247 1726867234.58811: variable 'ansible_shell_executable' from source: unknown 15247 1726867234.58814: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867234.58816: variable 'ansible_pipelining' from source: unknown 15247 1726867234.58818: variable 'ansible_timeout' from source: unknown 15247 1726867234.58820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867234.58989: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867234.59042: variable 'omit' from source: magic vars 15247 1726867234.59057: starting attempt loop 15247 1726867234.59065: running the handler 15247 1726867234.59088: variable 'ansible_facts' from source: unknown 15247 1726867234.59110: _low_level_execute_command(): starting 15247 1726867234.59128: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867234.59844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867234.59856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 
1726867234.59897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867234.59995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867234.60017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867234.60056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867234.60138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867234.62483: stdout chunk (state=3): >>>/root <<< 15247 1726867234.62685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867234.62698: stdout chunk (state=3): >>><<< 15247 1726867234.62713: stderr chunk (state=3): >>><<< 15247 1726867234.62849: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867234.62853: _low_level_execute_command(): starting 15247 1726867234.62856: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440 `" && echo ansible-tmp-1726867234.6274915-15496-213468839555440="` echo /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440 `" ) && sleep 0' 15247 1726867234.63367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867234.63385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867234.63402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867234.63425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867234.63443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867234.63498: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867234.63565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867234.63585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867234.63613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867234.63690: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867234.66412: stdout chunk (state=3): >>>ansible-tmp-1726867234.6274915-15496-213468839555440=/root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440 <<< 15247 1726867234.66622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867234.66645: stdout chunk (state=3): >>><<< 15247 1726867234.66648: stderr chunk (state=3): >>><<< 15247 1726867234.66784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867234.6274915-15496-213468839555440=/root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867234.66788: variable 'ansible_module_compression' from source: unknown 15247 1726867234.66790: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867234.66834: variable 'ansible_facts' from source: unknown 15247 1726867234.67060: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py 15247 1726867234.67264: Sending initial data 15247 1726867234.67267: Sent initial data (154 bytes) 15247 1726867234.67862: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867234.67899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867234.67996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867234.68017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867234.68031: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867234.68117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867234.70363: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867234.70440: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867234.70499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpgd_polcr /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py <<< 15247 1726867234.70508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py" <<< 15247 1726867234.70553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpgd_polcr" to remote "/root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py" <<< 15247 1726867234.72217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867234.72221: stderr chunk (state=3): >>><<< 15247 1726867234.72223: stdout chunk (state=3): >>><<< 15247 1726867234.72226: done transferring module to remote 15247 1726867234.72228: _low_level_execute_command(): starting 15247 1726867234.72230: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/ /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py && sleep 0' 15247 1726867234.72882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867234.72936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867234.72958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867234.73003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867234.73119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867234.75766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867234.75790: stderr chunk (state=3): >>><<< 15247 1726867234.75798: stdout chunk (state=3): >>><<< 15247 1726867234.75824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867234.75906: _low_level_execute_command(): starting 15247 1726867234.75911: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/AnsiballZ_setup.py && sleep 0' 15247 1726867234.76492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867234.76514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867234.76530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867234.76700: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867234.76767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867235.61345: 
stdout chunk (state=3): >>> <<< 15247 1726867235.61404: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": 
"x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.63427734375, "5m": 0.384765625, "15m": 0.1875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", 
"weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "35", "epoch": "1726867235", "epoch_int": "1726867235", "date": "2024-09-20", "time": "17:20:35",<<< 15247 1726867235.61448: stdout chunk (state=3): >>> "iso8601_micro": "2024-09-20T21:20:35.196515Z", "iso8601": "2024-09-20T21:20:35Z", "iso8601_basic": "20240920T172035196515", "iso8601_basic_short": "20240920T172035", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 473, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796655104, "block_size": 4096, "block_total": 65519099, "block_available": 63915199, "block_used": 1603900, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", 
"tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_lsb": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867235.64128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867235.64138: stdout chunk (state=3): >>><<< 15247 1726867235.64152: stderr chunk (state=3): >>><<< 15247 1726867235.64200: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.63427734375, "5m": 0.384765625, "15m": 0.1875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", 
"ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "35", "epoch": "1726867235", "epoch_int": "1726867235", "date": "2024-09-20", "time": "17:20:35", "iso8601_micro": "2024-09-20T21:20:35.196515Z", "iso8601": "2024-09-20T21:20:35Z", "iso8601_basic": "20240920T172035196515", "iso8601_basic_short": "20240920T172035", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2948, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 583, "free": 2948}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 473, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796655104, "block_size": 4096, "block_total": 65519099, "block_available": 63915199, "block_used": 1603900, "inode_total": 131070960, "inode_available": 131029047, "inode_used": 41913, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", 
"ansible_fips": false, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", 
"receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on 
[fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off 
[fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_lsb": {}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867235.64786: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867235.64790: _low_level_execute_command(): starting 15247 1726867235.64792: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867234.6274915-15496-213468839555440/ > /dev/null 2>&1 && sleep 0' 15247 1726867235.65857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867235.66062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867235.66148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867235.69183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867235.69186: stdout chunk (state=3): >>><<< 15247 1726867235.69188: stderr chunk (state=3): >>><<< 15247 1726867235.69191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867235.69193: handler run complete 15247 1726867235.69582: variable 
'ansible_facts' from source: unknown 15247 1726867235.69585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.70186: variable 'ansible_facts' from source: unknown 15247 1726867235.70267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.70605: attempt loop complete, returning result 15247 1726867235.70614: _execute() done 15247 1726867235.70620: dumping result to json 15247 1726867235.70652: done dumping result, returning 15247 1726867235.70662: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-0000000000da] 15247 1726867235.70673: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000da 15247 1726867235.71384: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000da 15247 1726867235.71388: WORKER PROCESS EXITING ok: [managed_node2] 15247 1726867235.71856: no more pending results, returning what we have 15247 1726867235.71859: results queue empty 15247 1726867235.71860: checking for any_errors_fatal 15247 1726867235.71861: done checking for any_errors_fatal 15247 1726867235.71862: checking for max_fail_percentage 15247 1726867235.71864: done checking for max_fail_percentage 15247 1726867235.71864: checking to see if all hosts have failed and the running result is not ok 15247 1726867235.71865: done checking to see if all hosts have failed 15247 1726867235.71866: getting the remaining hosts for this loop 15247 1726867235.71867: done getting the remaining hosts for this loop 15247 1726867235.71871: getting the next task for host managed_node2 15247 1726867235.71878: done getting next task for host managed_node2 15247 1726867235.71880: ^ task is: TASK: meta (flush_handlers) 15247 1726867235.71882: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867235.71886: getting variables 15247 1726867235.71887: in VariableManager get_vars() 15247 1726867235.71912: Calling all_inventory to load vars for managed_node2 15247 1726867235.71915: Calling groups_inventory to load vars for managed_node2 15247 1726867235.71918: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.71929: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.71932: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.71935: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.72566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.73216: done with get_vars() 15247 1726867235.73227: done getting variables 15247 1726867235.73498: in VariableManager get_vars() 15247 1726867235.73507: Calling all_inventory to load vars for managed_node2 15247 1726867235.73509: Calling groups_inventory to load vars for managed_node2 15247 1726867235.73511: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.73515: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.73517: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.73519: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.74009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.74230: done with get_vars() 15247 1726867235.74244: done queuing things up, now waiting for results queue to drain 15247 1726867235.74246: results queue empty 15247 1726867235.74247: checking for any_errors_fatal 15247 1726867235.74250: done checking for any_errors_fatal 15247 1726867235.74251: checking for max_fail_percentage 15247 
1726867235.74252: done checking for max_fail_percentage 15247 1726867235.74253: checking to see if all hosts have failed and the running result is not ok 15247 1726867235.74253: done checking to see if all hosts have failed 15247 1726867235.74254: getting the remaining hosts for this loop 15247 1726867235.74259: done getting the remaining hosts for this loop 15247 1726867235.74262: getting the next task for host managed_node2 15247 1726867235.74266: done getting next task for host managed_node2 15247 1726867235.74268: ^ task is: TASK: Set interface={{ interface }} 15247 1726867235.74270: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867235.74272: getting variables 15247 1726867235.74273: in VariableManager get_vars() 15247 1726867235.74514: Calling all_inventory to load vars for managed_node2 15247 1726867235.74516: Calling groups_inventory to load vars for managed_node2 15247 1726867235.74518: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.74523: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.74525: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.74528: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.74771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.75050: done with get_vars() 15247 1726867235.75058: done getting variables 15247 1726867235.75308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 15247 1726867235.75431: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Friday 20 September 2024 17:20:35 -0400 (0:00:01.186) 0:00:05.464 ****** 15247 1726867235.75471: entering _queue_task() for managed_node2/set_fact 15247 1726867235.76157: worker is 1 (out of 1 available) 15247 1726867235.76169: exiting _queue_task() for managed_node2/set_fact 15247 1726867235.76183: done queuing things up, now waiting for results queue to drain 15247 1726867235.76185: waiting for pending results... 15247 1726867235.76647: running TaskExecutor() for managed_node2/TASK: Set interface=LSR-TST-br31 15247 1726867235.76799: in run() - task 0affcac9-a3a5-8ce3-1923-00000000000b 15247 1726867235.76815: variable 'ansible_search_path' from source: unknown 15247 1726867235.76962: calling self._execute() 15247 1726867235.77282: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.77286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867235.77288: variable 'omit' from source: magic vars 15247 1726867235.77858: variable 'ansible_distribution_major_version' from source: facts 15247 1726867235.77869: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867235.77876: variable 'omit' from source: magic vars 15247 1726867235.77902: variable 'omit' from source: magic vars 15247 1726867235.77990: variable 'interface' from source: play vars 15247 1726867235.78123: variable 'interface' from source: play vars 15247 1726867235.78196: variable 'omit' from source: magic vars 15247 1726867235.78349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867235.78385: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867235.78412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867235.78429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867235.78440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867235.78586: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867235.78589: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.78592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867235.78798: Set connection var ansible_shell_executable to /bin/sh 15247 1726867235.78801: Set connection var ansible_connection to ssh 15247 1726867235.78804: Set connection var ansible_shell_type to sh 15247 1726867235.78814: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867235.78820: Set connection var ansible_timeout to 10 15247 1726867235.78825: Set connection var ansible_pipelining to False 15247 1726867235.78930: variable 'ansible_shell_executable' from source: unknown 15247 1726867235.78934: variable 'ansible_connection' from source: unknown 15247 1726867235.78989: variable 'ansible_module_compression' from source: unknown 15247 1726867235.78993: variable 'ansible_shell_type' from source: unknown 15247 1726867235.78995: variable 'ansible_shell_executable' from source: unknown 15247 1726867235.78997: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.79002: variable 'ansible_pipelining' from source: unknown 15247 1726867235.79005: variable 'ansible_timeout' from source: unknown 15247 1726867235.79012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 
1726867235.79322: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867235.79332: variable 'omit' from source: magic vars 15247 1726867235.79337: starting attempt loop 15247 1726867235.79340: running the handler 15247 1726867235.79352: handler run complete 15247 1726867235.79364: attempt loop complete, returning result 15247 1726867235.79366: _execute() done 15247 1726867235.79369: dumping result to json 15247 1726867235.79583: done dumping result, returning 15247 1726867235.79587: done running TaskExecutor() for managed_node2/TASK: Set interface=LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-00000000000b] 15247 1726867235.79589: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000000b 15247 1726867235.79653: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000000b 15247 1726867235.79657: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 15247 1726867235.79737: no more pending results, returning what we have 15247 1726867235.79740: results queue empty 15247 1726867235.79741: checking for any_errors_fatal 15247 1726867235.79744: done checking for any_errors_fatal 15247 1726867235.79744: checking for max_fail_percentage 15247 1726867235.79746: done checking for max_fail_percentage 15247 1726867235.79746: checking to see if all hosts have failed and the running result is not ok 15247 1726867235.79748: done checking to see if all hosts have failed 15247 1726867235.79748: getting the remaining hosts for this loop 15247 1726867235.79749: done getting the remaining hosts for this loop 15247 1726867235.79753: getting the next task for host managed_node2 15247 1726867235.79759: done getting next task for host managed_node2 15247 
1726867235.79762: ^ task is: TASK: Include the task 'show_interfaces.yml' 15247 1726867235.79764: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867235.79768: getting variables 15247 1726867235.79770: in VariableManager get_vars() 15247 1726867235.79802: Calling all_inventory to load vars for managed_node2 15247 1726867235.79805: Calling groups_inventory to load vars for managed_node2 15247 1726867235.79809: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.79821: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.79825: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.79828: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.80327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.80749: done with get_vars() 15247 1726867235.80757: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Friday 20 September 2024 17:20:35 -0400 (0:00:00.055) 0:00:05.520 ****** 15247 1726867235.81044: entering _queue_task() for managed_node2/include_tasks 15247 1726867235.81703: worker is 1 (out of 1 available) 15247 1726867235.81713: exiting _queue_task() for managed_node2/include_tasks 15247 1726867235.81723: done queuing things up, now waiting for results queue to drain 15247 1726867235.81725: waiting for pending results... 
15247 1726867235.81990: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 15247 1726867235.82385: in run() - task 0affcac9-a3a5-8ce3-1923-00000000000c 15247 1726867235.82389: variable 'ansible_search_path' from source: unknown 15247 1726867235.82391: calling self._execute() 15247 1726867235.82748: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.82754: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867235.82764: variable 'omit' from source: magic vars 15247 1726867235.83580: variable 'ansible_distribution_major_version' from source: facts 15247 1726867235.83636: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867235.83646: _execute() done 15247 1726867235.83671: dumping result to json 15247 1726867235.83680: done dumping result, returning 15247 1726867235.83690: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [0affcac9-a3a5-8ce3-1923-00000000000c] 15247 1726867235.83841: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000000c 15247 1726867235.83941: no more pending results, returning what we have 15247 1726867235.83946: in VariableManager get_vars() 15247 1726867235.83981: Calling all_inventory to load vars for managed_node2 15247 1726867235.83984: Calling groups_inventory to load vars for managed_node2 15247 1726867235.83987: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.84003: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.84006: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.84010: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.84483: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000000c 15247 1726867235.84488: WORKER PROCESS EXITING 15247 1726867235.84693: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.85287: done with get_vars() 15247 1726867235.85295: variable 'ansible_search_path' from source: unknown 15247 1726867235.85309: we have included files to process 15247 1726867235.85310: generating all_blocks data 15247 1726867235.85312: done generating all_blocks data 15247 1726867235.85313: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15247 1726867235.85314: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15247 1726867235.85317: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 15247 1726867235.85463: in VariableManager get_vars() 15247 1726867235.85883: done with get_vars() 15247 1726867235.85983: done processing included file 15247 1726867235.85984: iterating over new_blocks loaded from include file 15247 1726867235.85986: in VariableManager get_vars() 15247 1726867235.85996: done with get_vars() 15247 1726867235.85997: filtering new block on tags 15247 1726867235.86012: done filtering new block on tags 15247 1726867235.86014: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 15247 1726867235.86019: extending task lists for all hosts with included blocks 15247 1726867235.86075: done extending task lists 15247 1726867235.86076: done processing included files 15247 1726867235.86480: results queue empty 15247 1726867235.86481: checking for any_errors_fatal 15247 1726867235.86486: done checking for any_errors_fatal 15247 1726867235.86487: checking for max_fail_percentage 15247 1726867235.86488: done checking for 
max_fail_percentage 15247 1726867235.86489: checking to see if all hosts have failed and the running result is not ok 15247 1726867235.86489: done checking to see if all hosts have failed 15247 1726867235.86490: getting the remaining hosts for this loop 15247 1726867235.86491: done getting the remaining hosts for this loop 15247 1726867235.86494: getting the next task for host managed_node2 15247 1726867235.86498: done getting next task for host managed_node2 15247 1726867235.86500: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 15247 1726867235.86502: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867235.86504: getting variables 15247 1726867235.86505: in VariableManager get_vars() 15247 1726867235.86513: Calling all_inventory to load vars for managed_node2 15247 1726867235.86515: Calling groups_inventory to load vars for managed_node2 15247 1726867235.86517: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.86523: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.86525: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.86528: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.87096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.87270: done with get_vars() 15247 1726867235.87681: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 17:20:35 -0400 (0:00:00.067) 0:00:05.587 ****** 15247 1726867235.87754: entering _queue_task() for managed_node2/include_tasks 15247 1726867235.88429: worker is 1 (out of 1 available) 15247 1726867235.88440: exiting _queue_task() for managed_node2/include_tasks 15247 1726867235.88452: done queuing things up, now waiting for results queue to drain 15247 1726867235.88454: waiting for pending results... 
15247 1726867235.89000: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 15247 1726867235.89248: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000ee 15247 1726867235.89270: variable 'ansible_search_path' from source: unknown 15247 1726867235.89281: variable 'ansible_search_path' from source: unknown 15247 1726867235.89330: calling self._execute() 15247 1726867235.89462: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.89543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867235.89562: variable 'omit' from source: magic vars 15247 1726867235.90400: variable 'ansible_distribution_major_version' from source: facts 15247 1726867235.90434: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867235.90475: _execute() done 15247 1726867235.90488: dumping result to json 15247 1726867235.90498: done dumping result, returning 15247 1726867235.90620: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [0affcac9-a3a5-8ce3-1923-0000000000ee] 15247 1726867235.90623: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000ee 15247 1726867235.90820: no more pending results, returning what we have 15247 1726867235.90826: in VariableManager get_vars() 15247 1726867235.90862: Calling all_inventory to load vars for managed_node2 15247 1726867235.90865: Calling groups_inventory to load vars for managed_node2 15247 1726867235.90869: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.90886: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.90889: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.90892: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.91535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 
1726867235.91951: done with get_vars() 15247 1726867235.91959: variable 'ansible_search_path' from source: unknown 15247 1726867235.91960: variable 'ansible_search_path' from source: unknown 15247 1726867235.92147: we have included files to process 15247 1726867235.92149: generating all_blocks data 15247 1726867235.92151: done generating all_blocks data 15247 1726867235.92152: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15247 1726867235.92153: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15247 1726867235.92156: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 15247 1726867235.92566: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000ee 15247 1726867235.92569: WORKER PROCESS EXITING 15247 1726867235.92868: done processing included file 15247 1726867235.92870: iterating over new_blocks loaded from include file 15247 1726867235.92872: in VariableManager get_vars() 15247 1726867235.92920: done with get_vars() 15247 1726867235.92923: filtering new block on tags 15247 1726867235.92941: done filtering new block on tags 15247 1726867235.92943: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 15247 1726867235.92949: extending task lists for all hosts with included blocks 15247 1726867235.93165: done extending task lists 15247 1726867235.93166: done processing included files 15247 1726867235.93167: results queue empty 15247 1726867235.93168: checking for any_errors_fatal 15247 1726867235.93170: done checking for any_errors_fatal 15247 1726867235.93171: checking for max_fail_percentage 15247 
1726867235.93172: done checking for max_fail_percentage 15247 1726867235.93173: checking to see if all hosts have failed and the running result is not ok 15247 1726867235.93174: done checking to see if all hosts have failed 15247 1726867235.93174: getting the remaining hosts for this loop 15247 1726867235.93175: done getting the remaining hosts for this loop 15247 1726867235.93281: getting the next task for host managed_node2 15247 1726867235.93310: done getting next task for host managed_node2 15247 1726867235.93313: ^ task is: TASK: Gather current interface info 15247 1726867235.93316: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867235.93319: getting variables 15247 1726867235.93320: in VariableManager get_vars() 15247 1726867235.93328: Calling all_inventory to load vars for managed_node2 15247 1726867235.93330: Calling groups_inventory to load vars for managed_node2 15247 1726867235.93332: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867235.93338: Calling all_plugins_play to load vars for managed_node2 15247 1726867235.93340: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867235.93343: Calling groups_plugins_play to load vars for managed_node2 15247 1726867235.93641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867235.94124: done with get_vars() 15247 1726867235.94132: done getting variables 15247 1726867235.94165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 17:20:35 -0400 (0:00:00.064) 0:00:05.651 ****** 15247 1726867235.94192: entering _queue_task() for managed_node2/command 15247 1726867235.94900: worker is 1 (out of 1 available) 15247 1726867235.94911: exiting _queue_task() for managed_node2/command 15247 1726867235.94923: done queuing things up, now waiting for results queue to drain 15247 1726867235.94924: waiting for pending results... 
15247 1726867235.95155: running TaskExecutor() for managed_node2/TASK: Gather current interface info 15247 1726867235.95267: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000fd 15247 1726867235.95300: variable 'ansible_search_path' from source: unknown 15247 1726867235.95309: variable 'ansible_search_path' from source: unknown 15247 1726867235.95348: calling self._execute() 15247 1726867235.95432: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.95444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867235.95461: variable 'omit' from source: magic vars 15247 1726867235.95839: variable 'ansible_distribution_major_version' from source: facts 15247 1726867235.95856: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867235.95867: variable 'omit' from source: magic vars 15247 1726867235.95915: variable 'omit' from source: magic vars 15247 1726867235.95965: variable 'omit' from source: magic vars 15247 1726867235.96008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867235.96056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867235.96081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867235.96161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867235.96164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867235.96166: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867235.96168: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.96170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 
1726867235.96300: Set connection var ansible_shell_executable to /bin/sh 15247 1726867235.96308: Set connection var ansible_connection to ssh 15247 1726867235.96315: Set connection var ansible_shell_type to sh 15247 1726867235.96325: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867235.96338: Set connection var ansible_timeout to 10 15247 1726867235.96347: Set connection var ansible_pipelining to False 15247 1726867235.96415: variable 'ansible_shell_executable' from source: unknown 15247 1726867235.96423: variable 'ansible_connection' from source: unknown 15247 1726867235.96430: variable 'ansible_module_compression' from source: unknown 15247 1726867235.96489: variable 'ansible_shell_type' from source: unknown 15247 1726867235.96492: variable 'ansible_shell_executable' from source: unknown 15247 1726867235.96494: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867235.96496: variable 'ansible_pipelining' from source: unknown 15247 1726867235.96498: variable 'ansible_timeout' from source: unknown 15247 1726867235.96500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867235.96615: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867235.96631: variable 'omit' from source: magic vars 15247 1726867235.96642: starting attempt loop 15247 1726867235.96649: running the handler 15247 1726867235.96668: _low_level_execute_command(): starting 15247 1726867235.96684: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867235.97573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867235.97713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867235.97759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 15247 1726867235.99987: stdout chunk (state=3): >>>/root <<< 15247 1726867236.00060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.00321: stdout chunk (state=3): >>><<< 15247 1726867236.00324: stderr chunk (state=3): >>><<< 15247 1726867236.00328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 15247 1726867236.00330: _low_level_execute_command(): starting 15247 1726867236.00336: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093 `" && echo ansible-tmp-1726867236.0022216-15579-23851170836093="` echo /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093 `" ) && sleep 0' 15247 1726867236.00881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867236.00897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.01000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.01033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867236.01055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.01083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.01298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.03053: stdout chunk (state=3): >>>ansible-tmp-1726867236.0022216-15579-23851170836093=/root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093 <<< 15247 1726867236.03205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.03209: stdout chunk (state=3): >>><<< 15247 1726867236.03212: stderr chunk (state=3): >>><<< 15247 1726867236.03229: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867236.0022216-15579-23851170836093=/root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867236.03268: variable 'ansible_module_compression' from source: unknown 15247 1726867236.03496: ANSIBALLZ: Using generic lock for ansible.legacy.command 15247 1726867236.03499: ANSIBALLZ: Acquiring lock 15247 1726867236.03501: ANSIBALLZ: Lock acquired: 140393880930304 15247 1726867236.03503: ANSIBALLZ: Creating module 15247 1726867236.21810: ANSIBALLZ: Writing module into payload 15247 1726867236.21916: ANSIBALLZ: Writing module 15247 1726867236.21946: ANSIBALLZ: Renaming module 15247 1726867236.21956: ANSIBALLZ: Done creating module 15247 1726867236.21980: variable 'ansible_facts' from source: unknown 15247 1726867236.22066: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py 15247 1726867236.22275: Sending initial data 15247 1726867236.22280: Sent initial data (155 bytes) 15247 1726867236.22901: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.22929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867236.22932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.22934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.22998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.24654: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867236.24689: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867236.24741: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmprl1bxwrs /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py <<< 15247 1726867236.24759: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py" <<< 15247 1726867236.24799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmprl1bxwrs" to remote "/root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py" <<< 15247 1726867236.25799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.25887: stderr chunk (state=3): >>><<< 15247 1726867236.25890: stdout chunk (state=3): >>><<< 15247 1726867236.26387: done transferring module to remote 15247 1726867236.26390: _low_level_execute_command(): starting 15247 1726867236.26393: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/ /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py && sleep 0' 15247 1726867236.26813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.26816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867236.26818: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867236.26820: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.26822: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.26824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867236.26826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.26955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.27108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.29101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.29105: stdout chunk (state=3): >>><<< 15247 1726867236.29107: stderr chunk (state=3): >>><<< 15247 1726867236.29110: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867236.29112: _low_level_execute_command(): starting 15247 1726867236.29114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/AnsiballZ_command.py && sleep 0' 15247 1726867236.29656: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867236.29668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.29681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867236.29697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867236.29795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.29820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.29900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.45609: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:20:36.451062", "end": "2024-09-20 17:20:36.454353", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15247 1726867236.47562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867236.47567: stdout chunk (state=3): >>><<< 15247 1726867236.47569: stderr chunk (state=3): >>><<< 15247 1726867236.47572: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 17:20:36.451062", "end": "2024-09-20 17:20:36.454353", "delta": "0:00:00.003291", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867236.47575: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867236.47580: _low_level_execute_command(): starting 15247 1726867236.47582: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867236.0022216-15579-23851170836093/ > /dev/null 2>&1 && sleep 0' 15247 1726867236.48118: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867236.48141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.48172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.48216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.50484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.50487: stdout chunk (state=3): >>><<< 15247 1726867236.50490: stderr chunk (state=3): >>><<< 15247 1726867236.50493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867236.50495: handler run complete 15247 1726867236.50497: Evaluated conditional (False): False 15247 1726867236.50499: attempt loop complete, returning result 15247 1726867236.50501: _execute() done 15247 1726867236.50502: dumping result to json 15247 1726867236.50507: done dumping result, returning 15247 1726867236.50509: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [0affcac9-a3a5-8ce3-1923-0000000000fd] 15247 1726867236.50511: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000fd 15247 1726867236.50791: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000fd 15247 1726867236.50795: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003291", "end": "2024-09-20 17:20:36.454353", "rc": 0, "start": "2024-09-20 17:20:36.451062" } STDOUT: bonding_masters eth0 lo 15247 1726867236.50880: no more pending results, returning what we have 15247 1726867236.50884: results queue empty 15247 1726867236.50885: checking for any_errors_fatal 15247 1726867236.50887: done checking for any_errors_fatal 15247 1726867236.50887: checking for max_fail_percentage 15247 1726867236.50889: done checking for 
max_fail_percentage 15247 1726867236.50890: checking to see if all hosts have failed and the running result is not ok 15247 1726867236.50891: done checking to see if all hosts have failed 15247 1726867236.50892: getting the remaining hosts for this loop 15247 1726867236.50893: done getting the remaining hosts for this loop 15247 1726867236.50897: getting the next task for host managed_node2 15247 1726867236.50907: done getting next task for host managed_node2 15247 1726867236.50910: ^ task is: TASK: Set current_interfaces 15247 1726867236.50914: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867236.50918: getting variables 15247 1726867236.50920: in VariableManager get_vars() 15247 1726867236.50953: Calling all_inventory to load vars for managed_node2 15247 1726867236.50956: Calling groups_inventory to load vars for managed_node2 15247 1726867236.50960: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.50972: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.50975: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.51200: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.51818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.52219: done with get_vars() 15247 1726867236.52229: done getting variables 15247 1726867236.52305: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 17:20:36 -0400 (0:00:00.581) 0:00:06.233 ****** 15247 1726867236.52380: entering _queue_task() for managed_node2/set_fact 15247 1726867236.52839: worker is 1 (out of 1 available) 15247 1726867236.52851: exiting _queue_task() for managed_node2/set_fact 15247 1726867236.52862: done queuing things up, now waiting for results queue to drain 15247 1726867236.52863: waiting for pending results... 
15247 1726867236.53596: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 15247 1726867236.53601: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000fe 15247 1726867236.53608: variable 'ansible_search_path' from source: unknown 15247 1726867236.53611: variable 'ansible_search_path' from source: unknown 15247 1726867236.53613: calling self._execute() 15247 1726867236.54285: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.54288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.54291: variable 'omit' from source: magic vars 15247 1726867236.55582: variable 'ansible_distribution_major_version' from source: facts 15247 1726867236.55586: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867236.55589: variable 'omit' from source: magic vars 15247 1726867236.55591: variable 'omit' from source: magic vars 15247 1726867236.56009: variable '_current_interfaces' from source: set_fact 15247 1726867236.56073: variable 'omit' from source: magic vars 15247 1726867236.56323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867236.56362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867236.56607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867236.56630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867236.56645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867236.56676: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867236.56811: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.56821: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.57149: Set connection var ansible_shell_executable to /bin/sh 15247 1726867236.57156: Set connection var ansible_connection to ssh 15247 1726867236.57164: Set connection var ansible_shell_type to sh 15247 1726867236.57174: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867236.57189: Set connection var ansible_timeout to 10 15247 1726867236.57199: Set connection var ansible_pipelining to False 15247 1726867236.57230: variable 'ansible_shell_executable' from source: unknown 15247 1726867236.57682: variable 'ansible_connection' from source: unknown 15247 1726867236.57687: variable 'ansible_module_compression' from source: unknown 15247 1726867236.57690: variable 'ansible_shell_type' from source: unknown 15247 1726867236.57692: variable 'ansible_shell_executable' from source: unknown 15247 1726867236.57694: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.57696: variable 'ansible_pipelining' from source: unknown 15247 1726867236.57698: variable 'ansible_timeout' from source: unknown 15247 1726867236.57700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.58033: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867236.58037: variable 'omit' from source: magic vars 15247 1726867236.58039: starting attempt loop 15247 1726867236.58042: running the handler 15247 1726867236.58044: handler run complete 15247 1726867236.58045: attempt loop complete, returning result 15247 1726867236.58047: _execute() done 15247 1726867236.58050: dumping result to json 15247 1726867236.58052: done dumping result, returning 15247 
1726867236.58054: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [0affcac9-a3a5-8ce3-1923-0000000000fe] 15247 1726867236.58056: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000fe ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 15247 1726867236.58309: no more pending results, returning what we have 15247 1726867236.58313: results queue empty 15247 1726867236.58314: checking for any_errors_fatal 15247 1726867236.58323: done checking for any_errors_fatal 15247 1726867236.58324: checking for max_fail_percentage 15247 1726867236.58326: done checking for max_fail_percentage 15247 1726867236.58327: checking to see if all hosts have failed and the running result is not ok 15247 1726867236.58328: done checking to see if all hosts have failed 15247 1726867236.58329: getting the remaining hosts for this loop 15247 1726867236.58330: done getting the remaining hosts for this loop 15247 1726867236.58333: getting the next task for host managed_node2 15247 1726867236.58343: done getting next task for host managed_node2 15247 1726867236.58345: ^ task is: TASK: Show current_interfaces 15247 1726867236.58349: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867236.58354: getting variables 15247 1726867236.58356: in VariableManager get_vars() 15247 1726867236.58390: Calling all_inventory to load vars for managed_node2 15247 1726867236.58393: Calling groups_inventory to load vars for managed_node2 15247 1726867236.58397: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.58413: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.58416: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.58419: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.59099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.59766: done with get_vars() 15247 1726867236.59775: done getting variables 15247 1726867236.59913: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000fe 15247 1726867236.59916: WORKER PROCESS EXITING 15247 1726867236.60148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 17:20:36 -0400 (0:00:00.077) 0:00:06.311 ****** 15247 1726867236.60184: entering _queue_task() for managed_node2/debug 15247 1726867236.60186: Creating lock for debug 15247 1726867236.60622: worker is 1 (out of 1 available) 15247 1726867236.60634: exiting _queue_task() for managed_node2/debug 15247 1726867236.60643: done queuing things up, now waiting for results queue to drain 15247 1726867236.60644: waiting for pending results... 
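The two tasks traced above follow a common pair: a `set_fact` that copies a working variable into a named fact, then a `debug` that prints it. A minimal sketch of that pattern, using only the names visible in the trace (`_current_interfaces` from the set_fact source, the two task names, and the `msg` format); this is not the verbatim contents of `show_interfaces.yml`, whose source is not included in this log:

```yaml
# Sketch of the pattern traced above, reconstructed from the log.
# `_current_interfaces` is the intermediate variable the trace reports
# as "from source: set_fact"; the exact task file may differ.
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces }}"

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```

Because `set_fact` and `debug` run entirely on the controller, the trace shows "running the handler" and "handler run complete" back to back with no `_low_level_execute_command()` calls, unlike the `stat` task later in the log.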
15247 1726867236.60858: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 15247 1726867236.60954: in run() - task 0affcac9-a3a5-8ce3-1923-0000000000ef 15247 1726867236.60981: variable 'ansible_search_path' from source: unknown 15247 1726867236.60990: variable 'ansible_search_path' from source: unknown 15247 1726867236.61033: calling self._execute() 15247 1726867236.61115: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.61127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.61141: variable 'omit' from source: magic vars 15247 1726867236.61576: variable 'ansible_distribution_major_version' from source: facts 15247 1726867236.61596: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867236.61615: variable 'omit' from source: magic vars 15247 1726867236.61658: variable 'omit' from source: magic vars 15247 1726867236.61768: variable 'current_interfaces' from source: set_fact 15247 1726867236.61799: variable 'omit' from source: magic vars 15247 1726867236.61844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867236.61881: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867236.61906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867236.61926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867236.61944: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867236.61975: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867236.61985: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.61993: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.62102: Set connection var ansible_shell_executable to /bin/sh 15247 1726867236.62112: Set connection var ansible_connection to ssh 15247 1726867236.62118: Set connection var ansible_shell_type to sh 15247 1726867236.62127: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867236.62136: Set connection var ansible_timeout to 10 15247 1726867236.62143: Set connection var ansible_pipelining to False 15247 1726867236.62173: variable 'ansible_shell_executable' from source: unknown 15247 1726867236.62382: variable 'ansible_connection' from source: unknown 15247 1726867236.62385: variable 'ansible_module_compression' from source: unknown 15247 1726867236.62387: variable 'ansible_shell_type' from source: unknown 15247 1726867236.62390: variable 'ansible_shell_executable' from source: unknown 15247 1726867236.62392: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.62393: variable 'ansible_pipelining' from source: unknown 15247 1726867236.62395: variable 'ansible_timeout' from source: unknown 15247 1726867236.62397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.62400: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867236.62403: variable 'omit' from source: magic vars 15247 1726867236.62407: starting attempt loop 15247 1726867236.62410: running the handler 15247 1726867236.62431: handler run complete 15247 1726867236.62454: attempt loop complete, returning result 15247 1726867236.62461: _execute() done 15247 1726867236.62469: dumping result to json 15247 1726867236.62479: done dumping result, returning 15247 1726867236.62492: done 
running TaskExecutor() for managed_node2/TASK: Show current_interfaces [0affcac9-a3a5-8ce3-1923-0000000000ef] 15247 1726867236.62502: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000ef 15247 1726867236.62612: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000000ef 15247 1726867236.62621: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 15247 1726867236.62674: no more pending results, returning what we have 15247 1726867236.62679: results queue empty 15247 1726867236.62680: checking for any_errors_fatal 15247 1726867236.62684: done checking for any_errors_fatal 15247 1726867236.62685: checking for max_fail_percentage 15247 1726867236.62687: done checking for max_fail_percentage 15247 1726867236.62688: checking to see if all hosts have failed and the running result is not ok 15247 1726867236.62689: done checking to see if all hosts have failed 15247 1726867236.62690: getting the remaining hosts for this loop 15247 1726867236.62691: done getting the remaining hosts for this loop 15247 1726867236.62695: getting the next task for host managed_node2 15247 1726867236.62706: done getting next task for host managed_node2 15247 1726867236.62710: ^ task is: TASK: Include the task 'assert_device_absent.yml' 15247 1726867236.62711: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867236.62715: getting variables 15247 1726867236.62717: in VariableManager get_vars() 15247 1726867236.62745: Calling all_inventory to load vars for managed_node2 15247 1726867236.62748: Calling groups_inventory to load vars for managed_node2 15247 1726867236.62752: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.62764: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.62767: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.62770: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.63088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.63490: done with get_vars() 15247 1726867236.63499: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Friday 20 September 2024 17:20:36 -0400 (0:00:00.034) 0:00:06.345 ****** 15247 1726867236.63593: entering _queue_task() for managed_node2/include_tasks 15247 1726867236.63834: worker is 1 (out of 1 available) 15247 1726867236.63846: exiting _queue_task() for managed_node2/include_tasks 15247 1726867236.63858: done queuing things up, now waiting for results queue to drain 15247 1726867236.63859: waiting for pending results... 
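The next task is a dynamic include: `include_tasks` is resolved at run time, which is why the trace shows "we have included files to process", "generating all_blocks data", and "extending task lists for all hosts with included blocks" only after the include task itself completes. A hedged sketch of what `tests_bridge.yml:14` likely looks like; note that the conditional `ansible_distribution_major_version != '6'` is evaluated for every task in this trace, so it may be inherited from a play- or block-level `when` rather than written on each task:

```yaml
# Hedged sketch based on the task name and conditional seen in the
# trace -- the actual playbook source is not part of this log.
- name: Include the task 'assert_device_absent.yml'
  include_tasks: tasks/assert_device_absent.yml
  when: ansible_distribution_major_version != '6'
```

A dynamic include like this only adds its child tasks to the host's task list after the conditional passes, which matches the "extending task lists" / "done extending task lists" lines above.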
15247 1726867236.64070: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 15247 1726867236.64184: in run() - task 0affcac9-a3a5-8ce3-1923-00000000000d 15247 1726867236.64188: variable 'ansible_search_path' from source: unknown 15247 1726867236.64270: calling self._execute() 15247 1726867236.64297: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.64309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.64322: variable 'omit' from source: magic vars 15247 1726867236.64670: variable 'ansible_distribution_major_version' from source: facts 15247 1726867236.64691: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867236.64707: _execute() done 15247 1726867236.64716: dumping result to json 15247 1726867236.64723: done dumping result, returning 15247 1726867236.64731: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [0affcac9-a3a5-8ce3-1923-00000000000d] 15247 1726867236.64740: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000000d 15247 1726867236.64866: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000000d 15247 1726867236.64870: WORKER PROCESS EXITING 15247 1726867236.64896: no more pending results, returning what we have 15247 1726867236.64901: in VariableManager get_vars() 15247 1726867236.64934: Calling all_inventory to load vars for managed_node2 15247 1726867236.64937: Calling groups_inventory to load vars for managed_node2 15247 1726867236.64940: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.64951: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.64954: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.64957: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.65129: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.65317: done with get_vars() 15247 1726867236.65325: variable 'ansible_search_path' from source: unknown 15247 1726867236.65337: we have included files to process 15247 1726867236.65338: generating all_blocks data 15247 1726867236.65340: done generating all_blocks data 15247 1726867236.65346: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15247 1726867236.65347: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15247 1726867236.65350: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15247 1726867236.65512: in VariableManager get_vars() 15247 1726867236.65528: done with get_vars() 15247 1726867236.65698: done processing included file 15247 1726867236.65700: iterating over new_blocks loaded from include file 15247 1726867236.65702: in VariableManager get_vars() 15247 1726867236.65717: done with get_vars() 15247 1726867236.65718: filtering new block on tags 15247 1726867236.65735: done filtering new block on tags 15247 1726867236.65737: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 15247 1726867236.65742: extending task lists for all hosts with included blocks 15247 1726867236.65902: done extending task lists 15247 1726867236.65906: done processing included files 15247 1726867236.65907: results queue empty 15247 1726867236.65907: checking for any_errors_fatal 15247 1726867236.65910: done checking for any_errors_fatal 15247 1726867236.65911: checking for max_fail_percentage 15247 1726867236.65912: done 
checking for max_fail_percentage 15247 1726867236.65913: checking to see if all hosts have failed and the running result is not ok 15247 1726867236.65914: done checking to see if all hosts have failed 15247 1726867236.65915: getting the remaining hosts for this loop 15247 1726867236.65916: done getting the remaining hosts for this loop 15247 1726867236.65919: getting the next task for host managed_node2 15247 1726867236.65922: done getting next task for host managed_node2 15247 1726867236.65926: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15247 1726867236.65928: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867236.65931: getting variables 15247 1726867236.65932: in VariableManager get_vars() 15247 1726867236.65939: Calling all_inventory to load vars for managed_node2 15247 1726867236.65942: Calling groups_inventory to load vars for managed_node2 15247 1726867236.65944: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.65949: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.65951: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.65954: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.66090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.66272: done with get_vars() 15247 1726867236.66283: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:20:36 -0400 (0:00:00.027) 0:00:06.373 ****** 15247 1726867236.66356: entering _queue_task() for managed_node2/include_tasks 15247 1726867236.66620: worker is 1 (out of 1 available) 15247 1726867236.66632: exiting _queue_task() for managed_node2/include_tasks 15247 1726867236.66644: done queuing things up, now waiting for results queue to drain 15247 1726867236.66645: waiting for pending results... 
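This is a second level of dynamic inclusion: `assert_device_absent.yml` (itself included from `tests_bridge.yml`) pulls in `get_interface_stat.yml`, which is why the host state printed above now carries two nested "tasks child state?" entries. A sketch of the nesting, with paths abbreviated relative to the test playbook directory (an assumption; only the file names appear in the trace):

```yaml
# Sketch: first task inside assert_device_absent.yml, per the
# "task path: .../assert_device_absent.yml:3" banner above.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml
```

The included file's task name evidently contains a template, `Get stat for interface {{ interface }}`: the "task is:" line above shows the raw template, while the TASK banner that follows renders it as "Get stat for interface LSR-TST-br31" once the `interface` fact is available.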
15247 1726867236.67096: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 15247 1726867236.67101: in run() - task 0affcac9-a3a5-8ce3-1923-000000000119 15247 1726867236.67107: variable 'ansible_search_path' from source: unknown 15247 1726867236.67109: variable 'ansible_search_path' from source: unknown 15247 1726867236.67112: calling self._execute() 15247 1726867236.67134: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.67146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.67162: variable 'omit' from source: magic vars 15247 1726867236.67521: variable 'ansible_distribution_major_version' from source: facts 15247 1726867236.67537: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867236.67546: _execute() done 15247 1726867236.67557: dumping result to json 15247 1726867236.67563: done dumping result, returning 15247 1726867236.67571: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-8ce3-1923-000000000119] 15247 1726867236.67582: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000119 15247 1726867236.67703: no more pending results, returning what we have 15247 1726867236.67710: in VariableManager get_vars() 15247 1726867236.67743: Calling all_inventory to load vars for managed_node2 15247 1726867236.67745: Calling groups_inventory to load vars for managed_node2 15247 1726867236.67749: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.67761: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.67763: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.67779: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.68185: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000119 15247 1726867236.68189: WORKER PROCESS EXITING 15247 
1726867236.68215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.68446: done with get_vars() 15247 1726867236.68455: variable 'ansible_search_path' from source: unknown 15247 1726867236.68456: variable 'ansible_search_path' from source: unknown 15247 1726867236.68492: we have included files to process 15247 1726867236.68493: generating all_blocks data 15247 1726867236.68495: done generating all_blocks data 15247 1726867236.68496: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867236.68497: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867236.68499: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867236.68714: done processing included file 15247 1726867236.68716: iterating over new_blocks loaded from include file 15247 1726867236.68718: in VariableManager get_vars() 15247 1726867236.68729: done with get_vars() 15247 1726867236.68731: filtering new block on tags 15247 1726867236.68745: done filtering new block on tags 15247 1726867236.68747: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 15247 1726867236.68752: extending task lists for all hosts with included blocks 15247 1726867236.68854: done extending task lists 15247 1726867236.68855: done processing included files 15247 1726867236.68856: results queue empty 15247 1726867236.68856: checking for any_errors_fatal 15247 1726867236.68859: done checking for any_errors_fatal 15247 1726867236.68859: checking for max_fail_percentage 15247 1726867236.68861: done checking for 
max_fail_percentage 15247 1726867236.68861: checking to see if all hosts have failed and the running result is not ok 15247 1726867236.68862: done checking to see if all hosts have failed 15247 1726867236.68863: getting the remaining hosts for this loop 15247 1726867236.68864: done getting the remaining hosts for this loop 15247 1726867236.68867: getting the next task for host managed_node2 15247 1726867236.68871: done getting next task for host managed_node2 15247 1726867236.68873: ^ task is: TASK: Get stat for interface {{ interface }} 15247 1726867236.68876: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867236.68880: getting variables 15247 1726867236.68881: in VariableManager get_vars() 15247 1726867236.68888: Calling all_inventory to load vars for managed_node2 15247 1726867236.68890: Calling groups_inventory to load vars for managed_node2 15247 1726867236.68893: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867236.68897: Calling all_plugins_play to load vars for managed_node2 15247 1726867236.68899: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867236.68902: Calling groups_plugins_play to load vars for managed_node2 15247 1726867236.69040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867236.69221: done with get_vars() 15247 1726867236.69230: done getting variables 15247 1726867236.69384: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:20:36 -0400 (0:00:00.030) 0:00:06.403 ****** 15247 1726867236.69413: entering _queue_task() for managed_node2/stat 15247 1726867236.69646: worker is 1 (out of 1 available) 15247 1726867236.69658: exiting _queue_task() for managed_node2/stat 15247 1726867236.69882: done queuing things up, now waiting for results queue to drain 15247 1726867236.69884: waiting for pending results... 
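Unlike the earlier controller-side tasks, this `stat` task executes on the remote host, so the trace below shows the full module pipeline: loading the `normal` action plugin, `_low_level_execute_command()` probing the remote home directory, creating a remote temp directory, and transferring the ANSIBALLZ-packaged `AnsiballZ_stat.py`. A hedged sketch of `get_interface_stat.yml:3`; the log confirms the module (`stat`) and the templated task name, but not the path argument, so the `/sys/class/net/{{ interface }}` path is an assumption based on the conventional way to check for a network device's presence:

```yaml
# Hedged sketch -- the stat path is an assumption, not shown in the log.
# /sys/class/net/<iface> exists exactly when the kernel knows the device,
# which fits a task file named get_interface_stat.yml used by
# assert_device_absent.yml.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat  # register name is hypothetical
```

If that guess is right, the earlier `current_interfaces` list (`bonding_masters`, `eth0`, `lo`) not containing `LSR-TST-br31` is consistent with this stat being expected to return `exists: false`.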
15247 1726867236.69947: running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 15247 1726867236.70046: in run() - task 0affcac9-a3a5-8ce3-1923-000000000133 15247 1726867236.70064: variable 'ansible_search_path' from source: unknown 15247 1726867236.70070: variable 'ansible_search_path' from source: unknown 15247 1726867236.70115: calling self._execute() 15247 1726867236.70186: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.70199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.70220: variable 'omit' from source: magic vars 15247 1726867236.70922: variable 'ansible_distribution_major_version' from source: facts 15247 1726867236.70938: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867236.70948: variable 'omit' from source: magic vars 15247 1726867236.71007: variable 'omit' from source: magic vars 15247 1726867236.71192: variable 'interface' from source: set_fact 15247 1726867236.71195: variable 'omit' from source: magic vars 15247 1726867236.71198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867236.71210: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867236.71233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867236.71257: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867236.71272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867236.71314: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867236.71323: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.71331: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.71441: Set connection var ansible_shell_executable to /bin/sh 15247 1726867236.71450: Set connection var ansible_connection to ssh 15247 1726867236.71457: Set connection var ansible_shell_type to sh 15247 1726867236.71466: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867236.71475: Set connection var ansible_timeout to 10 15247 1726867236.71485: Set connection var ansible_pipelining to False 15247 1726867236.71514: variable 'ansible_shell_executable' from source: unknown 15247 1726867236.71524: variable 'ansible_connection' from source: unknown 15247 1726867236.71530: variable 'ansible_module_compression' from source: unknown 15247 1726867236.71535: variable 'ansible_shell_type' from source: unknown 15247 1726867236.71682: variable 'ansible_shell_executable' from source: unknown 15247 1726867236.71685: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867236.71688: variable 'ansible_pipelining' from source: unknown 15247 1726867236.71690: variable 'ansible_timeout' from source: unknown 15247 1726867236.71692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867236.71757: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867236.71772: variable 'omit' from source: magic vars 15247 1726867236.71787: starting attempt loop 15247 1726867236.71795: running the handler 15247 1726867236.71820: _low_level_execute_command(): starting 15247 1726867236.71833: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867236.72597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867236.72630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.72704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867236.72733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.72752: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.72833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.74498: stdout chunk (state=3): >>>/root <<< 15247 1726867236.74665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.74699: stdout chunk (state=3): >>><<< 15247 1726867236.74702: stderr chunk (state=3): >>><<< 15247 1726867236.74727: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867236.74825: _low_level_execute_command(): starting 15247 1726867236.74829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588 `" && echo ansible-tmp-1726867236.7473357-15631-41701594103588="` echo /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588 `" ) && sleep 0' 15247 1726867236.75414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867236.75427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.75439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867236.75453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867236.75548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.75574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.75643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.77565: stdout chunk (state=3): >>>ansible-tmp-1726867236.7473357-15631-41701594103588=/root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588 <<< 15247 1726867236.78107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.78111: stdout chunk (state=3): >>><<< 15247 1726867236.78113: stderr chunk (state=3): >>><<< 15247 1726867236.78116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867236.7473357-15631-41701594103588=/root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867236.78118: variable 'ansible_module_compression' from source: unknown 15247 1726867236.78285: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15247 1726867236.78332: variable 'ansible_facts' from source: unknown 15247 1726867236.78438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py 15247 1726867236.78748: Sending initial data 15247 1726867236.78752: Sent initial data (152 bytes) 15247 1726867236.79583: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.79587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867236.79589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.79594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 
1726867236.79604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.79661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867236.79673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.79731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.81341: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867236.81427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867236.81487: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp2b0cclyd /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py <<< 15247 1726867236.81490: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py" <<< 15247 1726867236.81541: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp2b0cclyd" to remote "/root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py" <<< 15247 1726867236.82771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.82840: stderr chunk (state=3): >>><<< 15247 1726867236.82852: stdout chunk (state=3): >>><<< 15247 1726867236.83363: done transferring module to remote 15247 1726867236.83366: _low_level_execute_command(): starting 15247 1726867236.83368: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/ /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py && sleep 0' 15247 1726867236.84537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867236.84674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.84743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867236.86569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867236.86611: stderr chunk (state=3): >>><<< 15247 1726867236.86621: stdout chunk (state=3): >>><<< 15247 1726867236.86642: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867236.86691: _low_level_execute_command(): starting 15247 1726867236.86702: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/AnsiballZ_stat.py && sleep 0' 15247 1726867236.87855: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867236.87894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867236.88125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867236.88197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867236.88200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.03782: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15247 1726867237.05090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867237.05111: stderr chunk (state=3): >>><<< 15247 1726867237.05114: stdout chunk (state=3): >>><<< 15247 1726867237.05128: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
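The stat result above (`"exists": false` for `/sys/class/net/LSR-TST-br31`) is how this test verifies the bridge is absent: on Linux the kernel exposes one entry per network interface under `/sys/class/net`, so a missing entry means a missing interface. A minimal sketch of that check, assuming a Linux host — `interface_stat` here is an illustrative helper mimicking the module's result shape, not Ansible's internal API:

```python
import os

def interface_stat(name: str) -> dict:
    """Return a stat-module-like result for a network interface.

    Mirrors the check in the log: stat /sys/class/net/<name> and
    report exists=False when the path is missing. The result shape
    ({"changed": ..., "stat": {"exists": ...}}) follows the module
    output captured above; the function itself is illustrative.
    """
    path = os.path.join("/sys/class/net", name)
    return {"changed": False, "stat": {"exists": os.path.exists(path)}}

# On the managed node in the log above this returned exists=False,
# since the LSR-TST-br31 bridge had not been created yet.
result = interface_stat("LSR-TST-br31")
```

The subsequent assert task in the log then simply evaluates `not interface_stat.stat.exists` against this registered result.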
15247 1726867237.05148: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867237.05159: _low_level_execute_command(): starting 15247 1726867237.05163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867236.7473357-15631-41701594103588/ > /dev/null 2>&1 && sleep 0' 15247 1726867237.05579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867237.05585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.05589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867237.05592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867237.05593: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.05642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867237.05646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.05692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.07541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867237.07563: stderr chunk (state=3): >>><<< 15247 1726867237.07566: stdout chunk (state=3): >>><<< 15247 1726867237.07579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867237.07585: handler run complete 15247 1726867237.07600: attempt loop complete, returning result 15247 1726867237.07602: _execute() done 15247 1726867237.07605: dumping result to json 15247 1726867237.07611: done dumping result, returning 15247 1726867237.07619: done running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000133] 15247 1726867237.07625: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000133 15247 1726867237.07716: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000133 15247 1726867237.07718: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15247 1726867237.08042: no more pending results, returning what we have 15247 1726867237.08046: results queue empty 15247 1726867237.08047: checking for any_errors_fatal 15247 1726867237.08049: done checking for any_errors_fatal 15247 1726867237.08050: checking for max_fail_percentage 15247 1726867237.08051: done checking for max_fail_percentage 15247 1726867237.08052: checking to see if all hosts have failed and the running result is not ok 15247 1726867237.08054: done checking to see if all hosts have failed 15247 1726867237.08055: getting the remaining hosts for this loop 15247 1726867237.08055: done getting the remaining hosts for this loop 15247 1726867237.08058: getting the next task for host managed_node2 15247 1726867237.08062: done getting next task for host managed_node2 15247 1726867237.08064: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15247 1726867237.08065: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867237.08067: getting variables 15247 1726867237.08068: in VariableManager get_vars() 15247 1726867237.08087: Calling all_inventory to load vars for managed_node2 15247 1726867237.08089: Calling groups_inventory to load vars for managed_node2 15247 1726867237.08091: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.08098: Calling all_plugins_play to load vars for managed_node2 15247 1726867237.08099: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.08101: Calling groups_plugins_play to load vars for managed_node2 15247 1726867237.08191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.08302: done with get_vars() 15247 1726867237.08309: done getting variables 15247 1726867237.08386: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 15247 1726867237.08466: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:20:37 -0400 (0:00:00.390) 0:00:06.794 ****** 15247 1726867237.08489: entering _queue_task() for managed_node2/assert 15247 1726867237.08490: Creating lock for 
assert 15247 1726867237.08679: worker is 1 (out of 1 available) 15247 1726867237.08690: exiting _queue_task() for managed_node2/assert 15247 1726867237.08701: done queuing things up, now waiting for results queue to drain 15247 1726867237.08702: waiting for pending results... 15247 1726867237.08852: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15247 1726867237.08916: in run() - task 0affcac9-a3a5-8ce3-1923-00000000011a 15247 1726867237.08930: variable 'ansible_search_path' from source: unknown 15247 1726867237.08933: variable 'ansible_search_path' from source: unknown 15247 1726867237.08957: calling self._execute() 15247 1726867237.09014: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867237.09018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867237.09026: variable 'omit' from source: magic vars 15247 1726867237.09282: variable 'ansible_distribution_major_version' from source: facts 15247 1726867237.09292: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867237.09297: variable 'omit' from source: magic vars 15247 1726867237.09326: variable 'omit' from source: magic vars 15247 1726867237.09392: variable 'interface' from source: set_fact 15247 1726867237.09405: variable 'omit' from source: magic vars 15247 1726867237.09439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867237.09464: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867237.09483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867237.09496: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867237.09505: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867237.09529: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867237.09533: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867237.09536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867237.09602: Set connection var ansible_shell_executable to /bin/sh 15247 1726867237.09606: Set connection var ansible_connection to ssh 15247 1726867237.09608: Set connection var ansible_shell_type to sh 15247 1726867237.09615: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867237.09622: Set connection var ansible_timeout to 10 15247 1726867237.09626: Set connection var ansible_pipelining to False 15247 1726867237.09644: variable 'ansible_shell_executable' from source: unknown 15247 1726867237.09647: variable 'ansible_connection' from source: unknown 15247 1726867237.09650: variable 'ansible_module_compression' from source: unknown 15247 1726867237.09653: variable 'ansible_shell_type' from source: unknown 15247 1726867237.09655: variable 'ansible_shell_executable' from source: unknown 15247 1726867237.09657: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867237.09662: variable 'ansible_pipelining' from source: unknown 15247 1726867237.09665: variable 'ansible_timeout' from source: unknown 15247 1726867237.09667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867237.09762: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867237.09771: variable 'omit' from source: magic vars 15247 1726867237.09776: starting 
attempt loop 15247 1726867237.09781: running the handler 15247 1726867237.10083: variable 'interface_stat' from source: set_fact 15247 1726867237.10086: Evaluated conditional (not interface_stat.stat.exists): True 15247 1726867237.10088: handler run complete 15247 1726867237.10090: attempt loop complete, returning result 15247 1726867237.10093: _execute() done 15247 1726867237.10095: dumping result to json 15247 1726867237.10098: done dumping result, returning 15247 1726867237.10101: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affcac9-a3a5-8ce3-1923-00000000011a] 15247 1726867237.10106: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000011a 15247 1726867237.10164: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000011a 15247 1726867237.10168: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867237.10210: no more pending results, returning what we have 15247 1726867237.10213: results queue empty 15247 1726867237.10213: checking for any_errors_fatal 15247 1726867237.10218: done checking for any_errors_fatal 15247 1726867237.10219: checking for max_fail_percentage 15247 1726867237.10220: done checking for max_fail_percentage 15247 1726867237.10220: checking to see if all hosts have failed and the running result is not ok 15247 1726867237.10221: done checking to see if all hosts have failed 15247 1726867237.10222: getting the remaining hosts for this loop 15247 1726867237.10223: done getting the remaining hosts for this loop 15247 1726867237.10226: getting the next task for host managed_node2 15247 1726867237.10232: done getting next task for host managed_node2 15247 1726867237.10234: ^ task is: TASK: meta (flush_handlers) 15247 1726867237.10236: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867237.10239: getting variables 15247 1726867237.10240: in VariableManager get_vars() 15247 1726867237.10261: Calling all_inventory to load vars for managed_node2 15247 1726867237.10262: Calling groups_inventory to load vars for managed_node2 15247 1726867237.10265: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.10273: Calling all_plugins_play to load vars for managed_node2 15247 1726867237.10275: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.10294: Calling groups_plugins_play to load vars for managed_node2 15247 1726867237.10446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.10648: done with get_vars() 15247 1726867237.10655: done getting variables 15247 1726867237.10712: in VariableManager get_vars() 15247 1726867237.10719: Calling all_inventory to load vars for managed_node2 15247 1726867237.10721: Calling groups_inventory to load vars for managed_node2 15247 1726867237.10723: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.10726: Calling all_plugins_play to load vars for managed_node2 15247 1726867237.10728: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.10731: Calling groups_plugins_play to load vars for managed_node2 15247 1726867237.10854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.11026: done with get_vars() 15247 1726867237.11037: done queuing things up, now waiting for results queue to drain 15247 1726867237.11038: results queue empty 15247 1726867237.11039: checking for any_errors_fatal 15247 1726867237.11041: done checking for any_errors_fatal 15247 1726867237.11041: checking for max_fail_percentage 15247 1726867237.11042: done checking for 
max_fail_percentage 15247 1726867237.11043: checking to see if all hosts have failed and the running result is not ok 15247 1726867237.11044: done checking to see if all hosts have failed 15247 1726867237.11048: getting the remaining hosts for this loop 15247 1726867237.11049: done getting the remaining hosts for this loop 15247 1726867237.11051: getting the next task for host managed_node2 15247 1726867237.11054: done getting next task for host managed_node2 15247 1726867237.11055: ^ task is: TASK: meta (flush_handlers) 15247 1726867237.11056: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867237.11058: getting variables 15247 1726867237.11059: in VariableManager get_vars() 15247 1726867237.11065: Calling all_inventory to load vars for managed_node2 15247 1726867237.11067: Calling groups_inventory to load vars for managed_node2 15247 1726867237.11069: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.11072: Calling all_plugins_play to load vars for managed_node2 15247 1726867237.11074: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.11079: Calling groups_plugins_play to load vars for managed_node2 15247 1726867237.11200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.11388: done with get_vars() 15247 1726867237.11394: done getting variables 15247 1726867237.11433: in VariableManager get_vars() 15247 1726867237.11439: Calling all_inventory to load vars for managed_node2 15247 1726867237.11441: Calling groups_inventory to load vars for managed_node2 15247 1726867237.11443: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.11446: Calling 
all_plugins_play to load vars for managed_node2 15247 1726867237.11448: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.11451: Calling groups_plugins_play to load vars for managed_node2 15247 1726867237.11574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.11741: done with get_vars() 15247 1726867237.11750: done queuing things up, now waiting for results queue to drain 15247 1726867237.11751: results queue empty 15247 1726867237.11752: checking for any_errors_fatal 15247 1726867237.11753: done checking for any_errors_fatal 15247 1726867237.11754: checking for max_fail_percentage 15247 1726867237.11755: done checking for max_fail_percentage 15247 1726867237.11755: checking to see if all hosts have failed and the running result is not ok 15247 1726867237.11756: done checking to see if all hosts have failed 15247 1726867237.11757: getting the remaining hosts for this loop 15247 1726867237.11757: done getting the remaining hosts for this loop 15247 1726867237.11759: getting the next task for host managed_node2 15247 1726867237.11761: done getting next task for host managed_node2 15247 1726867237.11762: ^ task is: None 15247 1726867237.11763: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867237.11764: done queuing things up, now waiting for results queue to drain 15247 1726867237.11765: results queue empty 15247 1726867237.11766: checking for any_errors_fatal 15247 1726867237.11766: done checking for any_errors_fatal 15247 1726867237.11767: checking for max_fail_percentage 15247 1726867237.11768: done checking for max_fail_percentage 15247 1726867237.11768: checking to see if all hosts have failed and the running result is not ok 15247 1726867237.11769: done checking to see if all hosts have failed 15247 1726867237.11771: getting the next task for host managed_node2 15247 1726867237.11772: done getting next task for host managed_node2 15247 1726867237.11773: ^ task is: None 15247 1726867237.11774: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867237.11810: in VariableManager get_vars() 15247 1726867237.11827: done with get_vars() 15247 1726867237.11832: in VariableManager get_vars() 15247 1726867237.11842: done with get_vars() 15247 1726867237.11845: variable 'omit' from source: magic vars 15247 1726867237.11869: in VariableManager get_vars() 15247 1726867237.11881: done with get_vars() 15247 1726867237.11897: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 15247 1726867237.12298: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867237.12318: getting the remaining hosts for this loop 15247 1726867237.12319: done getting the remaining hosts for this loop 15247 1726867237.12321: getting the next task for host managed_node2 15247 1726867237.12322: done getting next task for host managed_node2 15247 1726867237.12323: ^ task is: TASK: Gathering Facts 15247 1726867237.12324: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867237.12325: getting variables 15247 1726867237.12326: in VariableManager get_vars() 15247 1726867237.12333: Calling all_inventory to load vars for managed_node2 15247 1726867237.12334: Calling groups_inventory to load vars for managed_node2 15247 1726867237.12335: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.12338: Calling all_plugins_play to load vars for managed_node2 15247 1726867237.12339: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.12341: Calling groups_plugins_play to load vars for managed_node2 15247 1726867237.12419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.12525: done with get_vars() 15247 1726867237.12530: done getting variables 15247 1726867237.12553: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Friday 20 September 2024 17:20:37 -0400 (0:00:00.040) 0:00:06.835 ****** 15247 1726867237.12568: entering _queue_task() for managed_node2/gather_facts 15247 1726867237.12730: worker is 1 (out of 1 available) 15247 1726867237.12740: exiting _queue_task() for managed_node2/gather_facts 15247 1726867237.12751: done queuing things up, now waiting for results queue to drain 15247 1726867237.12752: waiting for pending results... 
15247 1726867237.12895: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867237.12953: in run() - task 0affcac9-a3a5-8ce3-1923-00000000014c 15247 1726867237.12965: variable 'ansible_search_path' from source: unknown 15247 1726867237.12995: calling self._execute() 15247 1726867237.13052: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867237.13056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867237.13066: variable 'omit' from source: magic vars 15247 1726867237.13317: variable 'ansible_distribution_major_version' from source: facts 15247 1726867237.13325: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867237.13331: variable 'omit' from source: magic vars 15247 1726867237.13347: variable 'omit' from source: magic vars 15247 1726867237.13371: variable 'omit' from source: magic vars 15247 1726867237.13401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867237.13431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867237.13445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867237.13459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867237.13467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867237.13491: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867237.13494: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867237.13496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867237.13564: Set connection var ansible_shell_executable to /bin/sh 15247 1726867237.13567: Set 
connection var ansible_connection to ssh 15247 1726867237.13570: Set connection var ansible_shell_type to sh 15247 1726867237.13572: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867237.13580: Set connection var ansible_timeout to 10 15247 1726867237.13585: Set connection var ansible_pipelining to False 15247 1726867237.13603: variable 'ansible_shell_executable' from source: unknown 15247 1726867237.13608: variable 'ansible_connection' from source: unknown 15247 1726867237.13612: variable 'ansible_module_compression' from source: unknown 15247 1726867237.13614: variable 'ansible_shell_type' from source: unknown 15247 1726867237.13616: variable 'ansible_shell_executable' from source: unknown 15247 1726867237.13618: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867237.13621: variable 'ansible_pipelining' from source: unknown 15247 1726867237.13623: variable 'ansible_timeout' from source: unknown 15247 1726867237.13627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867237.13746: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867237.13755: variable 'omit' from source: magic vars 15247 1726867237.13759: starting attempt loop 15247 1726867237.13761: running the handler 15247 1726867237.13774: variable 'ansible_facts' from source: unknown 15247 1726867237.13791: _low_level_execute_command(): starting 15247 1726867237.13797: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867237.14291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 
1726867237.14295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.14298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867237.14301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.14350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867237.14353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867237.14356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.14401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.16078: stdout chunk (state=3): >>>/root <<< 15247 1726867237.16181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867237.16205: stderr chunk (state=3): >>><<< 15247 1726867237.16210: stdout chunk (state=3): >>><<< 15247 1726867237.16230: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867237.16241: _low_level_execute_command(): starting 15247 1726867237.16246: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879 `" && echo ansible-tmp-1726867237.1622913-15683-264662675125879="` echo /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879 `" ) && sleep 0' 15247 1726867237.16647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867237.16650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.16659: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867237.16661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867237.16663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.16710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867237.16715: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.16757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.18638: stdout chunk (state=3): >>>ansible-tmp-1726867237.1622913-15683-264662675125879=/root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879 <<< 15247 1726867237.18744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867237.18764: stderr chunk (state=3): >>><<< 15247 1726867237.18767: stdout chunk (state=3): >>><<< 15247 1726867237.18782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867237.1622913-15683-264662675125879=/root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867237.18802: variable 'ansible_module_compression' from source: unknown 15247 1726867237.18842: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867237.18890: variable 'ansible_facts' from source: unknown 15247 1726867237.19021: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py 15247 1726867237.19196: Sending initial data 15247 1726867237.19199: Sent initial data (154 bytes) 15247 1726867237.19575: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867237.19580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867237.19582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.19585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867237.19587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.19635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867237.19638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.19682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.21282: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867237.21334: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867237.21395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpkjepz3gf /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py <<< 15247 1726867237.21402: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py" <<< 15247 1726867237.21435: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpkjepz3gf" to remote "/root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py" <<< 15247 1726867237.22463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867237.22495: stderr chunk (state=3): >>><<< 15247 1726867237.22498: stdout chunk (state=3): >>><<< 15247 1726867237.22514: done transferring module to remote 15247 1726867237.22522: _low_level_execute_command(): starting 15247 1726867237.22530: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/ /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py && sleep 0' 15247 1726867237.22912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867237.22915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.22928: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867237.22942: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.22982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867237.22998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.23034: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.24882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867237.24902: stderr chunk (state=3): >>><<< 15247 1726867237.24907: stdout chunk (state=3): >>><<< 15247 1726867237.24985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867237.24988: _low_level_execute_command(): starting 15247 1726867237.24991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/AnsiballZ_setup.py && sleep 0' 15247 1726867237.25511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867237.25529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867237.25541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867237.25558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867237.25639: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.25671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 
1726867237.25691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867237.25717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.25803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.90244: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, 
"ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", 
"tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_loadavg": {"1m": 0.66357421875, "5m": 0.39501953125, "15m": 0.19189453125}, "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "37", "epoch": "1726867237", "epoch_int": "1726867237", "date": "2024-09-20", "time": "17:20:37", "iso8601_micro": "2024-09-20T21:20:37.571712Z", "iso8601": "2024-09-20T21:20:37Z", "iso8601_basic": "20240920T172037571712", "iso8601_basic_short": "20240920T172037", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2954, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 577, "free": 2954}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", 
"ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 475, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796950016, "block_size": 4096, "block_total": 65519099, "block_available": 63915271, "block_used": 1603828, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867237.92436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867237.92442: stdout chunk (state=3): >>><<< 15247 1726867237.92445: stderr chunk (state=3): >>><<< 15247 1726867237.92448: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", 
"SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off 
[fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": 
"on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": 
"10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_loadavg": {"1m": 0.66357421875, "5m": 0.39501953125, "15m": 0.19189453125}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "37", "epoch": "1726867237", "epoch_int": "1726867237", "date": "2024-09-20", "time": "17:20:37", "iso8601_micro": "2024-09-20T21:20:37.571712Z", "iso8601": "2024-09-20T21:20:37Z", "iso8601_basic": "20240920T172037571712", "iso8601_basic_short": "20240920T172037", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2954, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 577, "free": 2954}, "nocache": {"free": 3291, "used": 240}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 475, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796950016, "block_size": 4096, "block_total": 65519099, "block_available": 63915271, "block_used": 1603828, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867237.92976: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867237.93141: _low_level_execute_command(): starting 15247 1726867237.93145: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867237.1622913-15683-264662675125879/ > /dev/null 2>&1 && sleep 0' 15247 1726867237.94496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867237.94623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867237.94649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867237.94673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867237.94847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867237.96714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867237.96782: stderr chunk (state=3): >>><<< 15247 1726867237.96791: stdout chunk (state=3): >>><<< 15247 1726867237.96900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867237.96914: handler run complete 15247 1726867237.97144: variable 'ansible_facts' from source: unknown 15247 1726867237.97354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.97945: variable 'ansible_facts' from source: unknown 15247 1726867237.98051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867237.98584: attempt loop complete, returning result 15247 1726867237.98587: _execute() done 15247 1726867237.98590: dumping result to json 15247 1726867237.98592: done dumping result, returning 15247 1726867237.98594: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-00000000014c] 15247 1726867237.98595: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000014c ok: [managed_node2] 15247 1726867237.99590: no more pending results, returning what we have 15247 1726867237.99594: results queue empty 15247 1726867237.99595: checking for any_errors_fatal 15247 1726867237.99596: done checking for any_errors_fatal 15247 1726867237.99597: checking for max_fail_percentage 15247 1726867237.99598: done checking for max_fail_percentage 15247 1726867237.99599: checking to see if all hosts have failed and the running result is not ok 15247 1726867237.99600: done checking to see if all hosts have failed 15247 1726867237.99600: getting the remaining hosts for this loop 15247 1726867237.99602: done getting the remaining hosts for this loop 15247 1726867237.99605: getting the next task for host managed_node2 15247 1726867237.99611: done getting next task for host managed_node2 15247 1726867237.99612: ^ task is: TASK: meta (flush_handlers) 
15247 1726867237.99614: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867237.99618: getting variables 15247 1726867237.99620: in VariableManager get_vars() 15247 1726867237.99652: Calling all_inventory to load vars for managed_node2 15247 1726867237.99655: Calling groups_inventory to load vars for managed_node2 15247 1726867237.99658: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867237.99670: Calling all_plugins_play to load vars for managed_node2 15247 1726867237.99673: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867237.99676: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.00274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.00772: done with get_vars() 15247 1726867238.00985: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000014c 15247 1726867238.00989: WORKER PROCESS EXITING 15247 1726867238.00997: done getting variables 15247 1726867238.01066: in VariableManager get_vars() 15247 1726867238.01079: Calling all_inventory to load vars for managed_node2 15247 1726867238.01082: Calling groups_inventory to load vars for managed_node2 15247 1726867238.01084: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.01088: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.01091: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.01094: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.01435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.01826: done with get_vars() 
15247 1726867238.01840: done queuing things up, now waiting for results queue to drain 15247 1726867238.01842: results queue empty 15247 1726867238.01842: checking for any_errors_fatal 15247 1726867238.01845: done checking for any_errors_fatal 15247 1726867238.01846: checking for max_fail_percentage 15247 1726867238.01847: done checking for max_fail_percentage 15247 1726867238.01848: checking to see if all hosts have failed and the running result is not ok 15247 1726867238.01853: done checking to see if all hosts have failed 15247 1726867238.01854: getting the remaining hosts for this loop 15247 1726867238.01855: done getting the remaining hosts for this loop 15247 1726867238.01857: getting the next task for host managed_node2 15247 1726867238.01861: done getting next task for host managed_node2 15247 1726867238.01864: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15247 1726867238.01865: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867238.01875: getting variables 15247 1726867238.01879: in VariableManager get_vars() 15247 1726867238.01893: Calling all_inventory to load vars for managed_node2 15247 1726867238.01895: Calling groups_inventory to load vars for managed_node2 15247 1726867238.01897: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.01901: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.01906: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.01910: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.02243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.02656: done with get_vars() 15247 1726867238.02664: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:20:38 -0400 (0:00:00.901) 0:00:07.736 ****** 15247 1726867238.02738: entering _queue_task() for managed_node2/include_tasks 15247 1726867238.03329: worker is 1 (out of 1 available) 15247 1726867238.03340: exiting _queue_task() for managed_node2/include_tasks 15247 1726867238.03350: done queuing things up, now waiting for results queue to drain 15247 1726867238.03352: waiting for pending results... 
15247 1726867238.03745: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15247 1726867238.03918: in run() - task 0affcac9-a3a5-8ce3-1923-000000000014 15247 1726867238.03943: variable 'ansible_search_path' from source: unknown 15247 1726867238.04023: variable 'ansible_search_path' from source: unknown 15247 1726867238.04065: calling self._execute() 15247 1726867238.04257: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.04271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867238.04290: variable 'omit' from source: magic vars 15247 1726867238.04988: variable 'ansible_distribution_major_version' from source: facts 15247 1726867238.05095: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867238.05111: _execute() done 15247 1726867238.05119: dumping result to json 15247 1726867238.05126: done dumping result, returning 15247 1726867238.05137: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-8ce3-1923-000000000014] 15247 1726867238.05147: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000014 15247 1726867238.05278: no more pending results, returning what we have 15247 1726867238.05284: in VariableManager get_vars() 15247 1726867238.05326: Calling all_inventory to load vars for managed_node2 15247 1726867238.05329: Calling groups_inventory to load vars for managed_node2 15247 1726867238.05332: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.05345: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.05348: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.05350: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.05728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 15247 1726867238.06124: done with get_vars() 15247 1726867238.06132: variable 'ansible_search_path' from source: unknown 15247 1726867238.06133: variable 'ansible_search_path' from source: unknown 15247 1726867238.06145: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000014 15247 1726867238.06148: WORKER PROCESS EXITING 15247 1726867238.06167: we have included files to process 15247 1726867238.06168: generating all_blocks data 15247 1726867238.06170: done generating all_blocks data 15247 1726867238.06170: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867238.06171: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867238.06174: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867238.07698: done processing included file 15247 1726867238.07700: iterating over new_blocks loaded from include file 15247 1726867238.07702: in VariableManager get_vars() 15247 1726867238.07724: done with get_vars() 15247 1726867238.07726: filtering new block on tags 15247 1726867238.07740: done filtering new block on tags 15247 1726867238.07743: in VariableManager get_vars() 15247 1726867238.07760: done with get_vars() 15247 1726867238.07761: filtering new block on tags 15247 1726867238.07781: done filtering new block on tags 15247 1726867238.07784: in VariableManager get_vars() 15247 1726867238.07802: done with get_vars() 15247 1726867238.07806: filtering new block on tags 15247 1726867238.07822: done filtering new block on tags 15247 1726867238.07825: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 15247 1726867238.07829: extending task lists for all hosts 
with included blocks 15247 1726867238.08625: done extending task lists 15247 1726867238.08626: done processing included files 15247 1726867238.08627: results queue empty 15247 1726867238.08628: checking for any_errors_fatal 15247 1726867238.08629: done checking for any_errors_fatal 15247 1726867238.08630: checking for max_fail_percentage 15247 1726867238.08631: done checking for max_fail_percentage 15247 1726867238.08631: checking to see if all hosts have failed and the running result is not ok 15247 1726867238.08632: done checking to see if all hosts have failed 15247 1726867238.08633: getting the remaining hosts for this loop 15247 1726867238.08634: done getting the remaining hosts for this loop 15247 1726867238.08637: getting the next task for host managed_node2 15247 1726867238.08640: done getting next task for host managed_node2 15247 1726867238.08643: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15247 1726867238.08645: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867238.08653: getting variables 15247 1726867238.08654: in VariableManager get_vars() 15247 1726867238.08667: Calling all_inventory to load vars for managed_node2 15247 1726867238.08669: Calling groups_inventory to load vars for managed_node2 15247 1726867238.08671: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.08675: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.08882: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.08887: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.09050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.09348: done with get_vars() 15247 1726867238.09358: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:20:38 -0400 (0:00:00.068) 0:00:07.805 ****** 15247 1726867238.09630: entering _queue_task() for managed_node2/setup 15247 1726867238.10187: worker is 1 (out of 1 available) 15247 1726867238.10199: exiting _queue_task() for managed_node2/setup 15247 1726867238.10324: done queuing things up, now waiting for results queue to drain 15247 1726867238.10326: waiting for pending results... 
15247 1726867238.10894: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15247 1726867238.10899: in run() - task 0affcac9-a3a5-8ce3-1923-00000000018d 15247 1726867238.10901: variable 'ansible_search_path' from source: unknown 15247 1726867238.10906: variable 'ansible_search_path' from source: unknown 15247 1726867238.10909: calling self._execute() 15247 1726867238.11035: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.11193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867238.11211: variable 'omit' from source: magic vars 15247 1726867238.11766: variable 'ansible_distribution_major_version' from source: facts 15247 1726867238.12182: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867238.12403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867238.16514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867238.16983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867238.16987: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867238.16990: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867238.16992: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867238.16995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867238.17202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867238.17236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867238.17282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867238.17306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867238.17363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867238.17393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867238.17611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867238.17657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867238.17676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867238.18044: variable '__network_required_facts' from source: role 
'' defaults 15247 1726867238.18061: variable 'ansible_facts' from source: unknown 15247 1726867238.18159: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15247 1726867238.18167: when evaluation is False, skipping this task 15247 1726867238.18175: _execute() done 15247 1726867238.18184: dumping result to json 15247 1726867238.18191: done dumping result, returning 15247 1726867238.18586: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-8ce3-1923-00000000018d] 15247 1726867238.18589: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000018d 15247 1726867238.18657: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000018d 15247 1726867238.18660: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867238.18732: no more pending results, returning what we have 15247 1726867238.18736: results queue empty 15247 1726867238.18739: checking for any_errors_fatal 15247 1726867238.18741: done checking for any_errors_fatal 15247 1726867238.18741: checking for max_fail_percentage 15247 1726867238.18743: done checking for max_fail_percentage 15247 1726867238.18744: checking to see if all hosts have failed and the running result is not ok 15247 1726867238.18744: done checking to see if all hosts have failed 15247 1726867238.18745: getting the remaining hosts for this loop 15247 1726867238.18747: done getting the remaining hosts for this loop 15247 1726867238.18751: getting the next task for host managed_node2 15247 1726867238.18759: done getting next task for host managed_node2 15247 1726867238.18762: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15247 1726867238.18765: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867238.18782: getting variables 15247 1726867238.18784: in VariableManager get_vars() 15247 1726867238.18823: Calling all_inventory to load vars for managed_node2 15247 1726867238.18826: Calling groups_inventory to load vars for managed_node2 15247 1726867238.18829: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.18839: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.18842: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.18845: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.19238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.19864: done with get_vars() 15247 1726867238.19875: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:20:38 -0400 (0:00:00.104) 0:00:07.909 ****** 15247 1726867238.20149: entering _queue_task() for managed_node2/stat 15247 1726867238.20646: worker is 1 (out of 1 available) 15247 1726867238.20660: exiting _queue_task() for managed_node2/stat 15247 1726867238.20672: done queuing things up, now waiting for results queue to drain 15247 1726867238.20673: waiting for pending results... 
15247 1726867238.21228: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15247 1726867238.21351: in run() - task 0affcac9-a3a5-8ce3-1923-00000000018f 15247 1726867238.21371: variable 'ansible_search_path' from source: unknown 15247 1726867238.21380: variable 'ansible_search_path' from source: unknown 15247 1726867238.21420: calling self._execute() 15247 1726867238.21661: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.21675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867238.21694: variable 'omit' from source: magic vars 15247 1726867238.22445: variable 'ansible_distribution_major_version' from source: facts 15247 1726867238.22514: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867238.22665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867238.23408: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867238.23456: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867238.23882: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867238.23885: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867238.23888: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867238.23890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867238.23892: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867238.23919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867238.24173: variable '__network_is_ostree' from source: set_fact 15247 1726867238.24187: Evaluated conditional (not __network_is_ostree is defined): False 15247 1726867238.24288: when evaluation is False, skipping this task 15247 1726867238.24296: _execute() done 15247 1726867238.24320: dumping result to json 15247 1726867238.24329: done dumping result, returning 15247 1726867238.24345: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-8ce3-1923-00000000018f] 15247 1726867238.24384: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000018f skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15247 1726867238.24533: no more pending results, returning what we have 15247 1726867238.24536: results queue empty 15247 1726867238.24537: checking for any_errors_fatal 15247 1726867238.24541: done checking for any_errors_fatal 15247 1726867238.24542: checking for max_fail_percentage 15247 1726867238.24543: done checking for max_fail_percentage 15247 1726867238.24544: checking to see if all hosts have failed and the running result is not ok 15247 1726867238.24545: done checking to see if all hosts have failed 15247 1726867238.24545: getting the remaining hosts for this loop 15247 1726867238.24547: done getting the remaining hosts for this loop 15247 1726867238.24550: getting the next task for host managed_node2 15247 1726867238.24556: done getting next task for host managed_node2 15247 
1726867238.24559: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15247 1726867238.24561: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867238.24573: getting variables 15247 1726867238.24575: in VariableManager get_vars() 15247 1726867238.24610: Calling all_inventory to load vars for managed_node2 15247 1726867238.24613: Calling groups_inventory to load vars for managed_node2 15247 1726867238.24615: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.24625: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.24627: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.24629: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.25191: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000018f 15247 1726867238.25194: WORKER PROCESS EXITING 15247 1726867238.25259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.25663: done with get_vars() 15247 1726867238.25673: done getting variables 15247 1726867238.25731: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:20:38 -0400 (0:00:00.056) 0:00:07.967 ****** 15247 1726867238.25762: entering _queue_task() for managed_node2/set_fact 15247 1726867238.26521: worker is 1 (out of 1 available) 15247 1726867238.26532: exiting _queue_task() for managed_node2/set_fact 15247 1726867238.26542: done queuing things up, now waiting for results queue to drain 15247 1726867238.26543: waiting for pending results... 15247 1726867238.26820: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15247 1726867238.27029: in run() - task 0affcac9-a3a5-8ce3-1923-000000000190 15247 1726867238.27044: variable 'ansible_search_path' from source: unknown 15247 1726867238.27047: variable 'ansible_search_path' from source: unknown 15247 1726867238.27200: calling self._execute() 15247 1726867238.27386: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.27393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867238.27406: variable 'omit' from source: magic vars 15247 1726867238.28180: variable 'ansible_distribution_major_version' from source: facts 15247 1726867238.28190: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867238.28859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867238.29583: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867238.29587: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867238.30186: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 
1726867238.30190: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867238.30193: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867238.30195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867238.30197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867238.30783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867238.30786: variable '__network_is_ostree' from source: set_fact 15247 1726867238.30788: Evaluated conditional (not __network_is_ostree is defined): False 15247 1726867238.30790: when evaluation is False, skipping this task 15247 1726867238.30792: _execute() done 15247 1726867238.30794: dumping result to json 15247 1726867238.30796: done dumping result, returning 15247 1726867238.30799: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-8ce3-1923-000000000190] 15247 1726867238.30800: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000190 15247 1726867238.30863: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000190 15247 1726867238.30867: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15247 1726867238.30919: no more pending results, returning what we 
have 15247 1726867238.30922: results queue empty 15247 1726867238.30923: checking for any_errors_fatal 15247 1726867238.30931: done checking for any_errors_fatal 15247 1726867238.30932: checking for max_fail_percentage 15247 1726867238.30934: done checking for max_fail_percentage 15247 1726867238.30935: checking to see if all hosts have failed and the running result is not ok 15247 1726867238.30936: done checking to see if all hosts have failed 15247 1726867238.30936: getting the remaining hosts for this loop 15247 1726867238.30938: done getting the remaining hosts for this loop 15247 1726867238.30942: getting the next task for host managed_node2 15247 1726867238.30951: done getting next task for host managed_node2 15247 1726867238.30954: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15247 1726867238.30957: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867238.30969: getting variables 15247 1726867238.30971: in VariableManager get_vars() 15247 1726867238.31012: Calling all_inventory to load vars for managed_node2 15247 1726867238.31015: Calling groups_inventory to load vars for managed_node2 15247 1726867238.31017: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867238.31028: Calling all_plugins_play to load vars for managed_node2 15247 1726867238.31031: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867238.31034: Calling groups_plugins_play to load vars for managed_node2 15247 1726867238.31406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867238.31803: done with get_vars() 15247 1726867238.31815: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:38 -0400 (0:00:00.063) 0:00:08.030 ****** 15247 1726867238.32103: entering _queue_task() for managed_node2/service_facts 15247 1726867238.32107: Creating lock for service_facts 15247 1726867238.32447: worker is 1 (out of 1 available) 15247 1726867238.32459: exiting _queue_task() for managed_node2/service_facts 15247 1726867238.32471: done queuing things up, now waiting for results queue to drain 15247 1726867238.32472: waiting for pending results... 
15247 1726867238.33493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 15247 1726867238.33498: in run() - task 0affcac9-a3a5-8ce3-1923-000000000192 15247 1726867238.33501: variable 'ansible_search_path' from source: unknown 15247 1726867238.33504: variable 'ansible_search_path' from source: unknown 15247 1726867238.33506: calling self._execute() 15247 1726867238.34083: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.34087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867238.34090: variable 'omit' from source: magic vars 15247 1726867238.34646: variable 'ansible_distribution_major_version' from source: facts 15247 1726867238.35082: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867238.35085: variable 'omit' from source: magic vars 15247 1726867238.35087: variable 'omit' from source: magic vars 15247 1726867238.35089: variable 'omit' from source: magic vars 15247 1726867238.35092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867238.35882: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867238.35886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867238.35888: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867238.35890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867238.35893: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867238.35895: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.35897: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 15247 1726867238.36482: Set connection var ansible_shell_executable to /bin/sh 15247 1726867238.36488: Set connection var ansible_connection to ssh 15247 1726867238.36491: Set connection var ansible_shell_type to sh 15247 1726867238.36493: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867238.36495: Set connection var ansible_timeout to 10 15247 1726867238.36497: Set connection var ansible_pipelining to False 15247 1726867238.36499: variable 'ansible_shell_executable' from source: unknown 15247 1726867238.36502: variable 'ansible_connection' from source: unknown 15247 1726867238.36507: variable 'ansible_module_compression' from source: unknown 15247 1726867238.36509: variable 'ansible_shell_type' from source: unknown 15247 1726867238.36510: variable 'ansible_shell_executable' from source: unknown 15247 1726867238.36512: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867238.36515: variable 'ansible_pipelining' from source: unknown 15247 1726867238.36516: variable 'ansible_timeout' from source: unknown 15247 1726867238.36518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867238.36928: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867238.36937: variable 'omit' from source: magic vars 15247 1726867238.36942: starting attempt loop 15247 1726867238.36945: running the handler 15247 1726867238.36959: _low_level_execute_command(): starting 15247 1726867238.36966: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867238.38209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867238.38214: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867238.38225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867238.38491: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867238.40213: stdout chunk (state=3): >>>/root <<< 15247 1726867238.40306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867238.40504: stderr chunk (state=3): >>><<< 15247 1726867238.40507: stdout chunk (state=3): >>><<< 15247 1726867238.40525: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867238.40543: _low_level_execute_command(): starting 15247 1726867238.40553: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504 `" && echo ansible-tmp-1726867238.4053137-15743-277840220172504="` echo /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504 `" ) && sleep 0' 15247 1726867238.41724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867238.41733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867238.41748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867238.41763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867238.41775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867238.41969: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867238.41986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867238.41996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867238.42066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867238.44072: stdout chunk (state=3): >>>ansible-tmp-1726867238.4053137-15743-277840220172504=/root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504 <<< 15247 1726867238.44213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867238.44223: stdout chunk (state=3): >>><<< 15247 1726867238.44234: stderr chunk (state=3): >>><<< 15247 1726867238.44252: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867238.4053137-15743-277840220172504=/root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867238.44306: variable 'ansible_module_compression' from source: unknown 15247 1726867238.44583: ANSIBALLZ: Using lock for service_facts 15247 1726867238.44586: ANSIBALLZ: Acquiring lock 15247 1726867238.44588: ANSIBALLZ: Lock acquired: 140393876729888 15247 1726867238.44590: ANSIBALLZ: Creating module 15247 1726867238.67283: ANSIBALLZ: Writing module into payload 15247 1726867238.67388: ANSIBALLZ: Writing module 15247 1726867238.67422: ANSIBALLZ: Renaming module 15247 1726867238.67443: ANSIBALLZ: Done creating module 15247 1726867238.67465: variable 'ansible_facts' from source: unknown 15247 1726867238.67552: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py 15247 1726867238.67703: Sending initial data 15247 1726867238.67801: Sent initial data (162 bytes) 15247 1726867238.68386: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867238.68466: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867238.68519: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867238.68547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867238.68619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867238.68634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867238.70317: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867238.70363: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867238.70410: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpa5oadmnq /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py <<< 15247 1726867238.70413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py" <<< 15247 1726867238.70785: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpa5oadmnq" to remote "/root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py" <<< 15247 1726867238.71691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867238.71694: stdout chunk (state=3): >>><<< 15247 1726867238.71697: stderr chunk (state=3): >>><<< 15247 1726867238.71708: done transferring module to remote 15247 1726867238.71724: _low_level_execute_command(): starting 15247 1726867238.71732: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/ /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py && sleep 0' 15247 1726867238.73308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867238.73875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867238.73994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867238.74195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867238.76000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867238.76004: stderr chunk (state=3): >>><<< 15247 1726867238.76006: stdout chunk (state=3): >>><<< 15247 1726867238.76008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867238.76011: _low_level_execute_command(): starting 15247 1726867238.76013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/AnsiballZ_service_facts.py && sleep 0' 15247 1726867238.77082: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867238.77295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867238.77375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867240.34092: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": 
"audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": 
"nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 15247 1726867240.34126: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": 
{"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": 
"systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 15247 1726867240.34154: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 15247 1726867240.34195: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15247 1726867240.35884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867240.35888: stdout chunk (state=3): >>><<< 15247 1726867240.35890: stderr chunk (state=3): >>><<< 15247 1726867240.35898: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": 
{"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": 
"systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867240.37162: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867240.37179: _low_level_execute_command(): starting 15247 1726867240.37190: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867238.4053137-15743-277840220172504/ > /dev/null 2>&1 && sleep 0' 15247 1726867240.38311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867240.38394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867240.38414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867240.38435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867240.38491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867240.38760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867240.38799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867240.40868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867240.40871: stdout chunk (state=3): >>><<< 15247 1726867240.40874: stderr chunk (state=3): >>><<< 15247 1726867240.41082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867240.41086: 
handler run complete 15247 1726867240.41267: variable 'ansible_facts' from source: unknown 15247 1726867240.41425: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867240.42459: variable 'ansible_facts' from source: unknown 15247 1726867240.42608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867240.42807: attempt loop complete, returning result 15247 1726867240.42817: _execute() done 15247 1726867240.42824: dumping result to json 15247 1726867240.42886: done dumping result, returning 15247 1726867240.42900: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-8ce3-1923-000000000192] 15247 1726867240.42918: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000192 15247 1726867240.44858: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000192 15247 1726867240.44861: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867240.44962: no more pending results, returning what we have 15247 1726867240.44965: results queue empty 15247 1726867240.44966: checking for any_errors_fatal 15247 1726867240.44971: done checking for any_errors_fatal 15247 1726867240.44972: checking for max_fail_percentage 15247 1726867240.44973: done checking for max_fail_percentage 15247 1726867240.44974: checking to see if all hosts have failed and the running result is not ok 15247 1726867240.44975: done checking to see if all hosts have failed 15247 1726867240.44976: getting the remaining hosts for this loop 15247 1726867240.44978: done getting the remaining hosts for this loop 15247 1726867240.44981: getting the next task for host managed_node2 15247 1726867240.44986: done getting next task for host managed_node2 15247 
1726867240.44990: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15247 1726867240.44992: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867240.45002: getting variables 15247 1726867240.45007: in VariableManager get_vars() 15247 1726867240.45037: Calling all_inventory to load vars for managed_node2 15247 1726867240.45039: Calling groups_inventory to load vars for managed_node2 15247 1726867240.45042: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867240.45051: Calling all_plugins_play to load vars for managed_node2 15247 1726867240.45053: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867240.45056: Calling groups_plugins_play to load vars for managed_node2 15247 1726867240.46260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867240.47997: done with get_vars() 15247 1726867240.48133: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:40 -0400 (0:00:02.163) 0:00:10.193 ****** 15247 1726867240.48428: entering _queue_task() for managed_node2/package_facts 15247 1726867240.48430: Creating lock for package_facts 15247 1726867240.49498: worker is 1 (out of 1 available) 15247 
1726867240.49510: exiting _queue_task() for managed_node2/package_facts 15247 1726867240.49520: done queuing things up, now waiting for results queue to drain 15247 1726867240.49522: waiting for pending results... 15247 1726867240.49829: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15247 1726867240.50084: in run() - task 0affcac9-a3a5-8ce3-1923-000000000193 15247 1726867240.50088: variable 'ansible_search_path' from source: unknown 15247 1726867240.50092: variable 'ansible_search_path' from source: unknown 15247 1726867240.50112: calling self._execute() 15247 1726867240.50486: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867240.50490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867240.50493: variable 'omit' from source: magic vars 15247 1726867240.51083: variable 'ansible_distribution_major_version' from source: facts 15247 1726867240.51099: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867240.51110: variable 'omit' from source: magic vars 15247 1726867240.51165: variable 'omit' from source: magic vars 15247 1726867240.51319: variable 'omit' from source: magic vars 15247 1726867240.51390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867240.51518: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867240.51543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867240.51605: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867240.51621: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867240.51657: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867240.51883: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867240.51886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867240.52183: Set connection var ansible_shell_executable to /bin/sh 15247 1726867240.52186: Set connection var ansible_connection to ssh 15247 1726867240.52188: Set connection var ansible_shell_type to sh 15247 1726867240.52190: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867240.52192: Set connection var ansible_timeout to 10 15247 1726867240.52194: Set connection var ansible_pipelining to False 15247 1726867240.52196: variable 'ansible_shell_executable' from source: unknown 15247 1726867240.52197: variable 'ansible_connection' from source: unknown 15247 1726867240.52200: variable 'ansible_module_compression' from source: unknown 15247 1726867240.52201: variable 'ansible_shell_type' from source: unknown 15247 1726867240.52203: variable 'ansible_shell_executable' from source: unknown 15247 1726867240.52205: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867240.52206: variable 'ansible_pipelining' from source: unknown 15247 1726867240.52208: variable 'ansible_timeout' from source: unknown 15247 1726867240.52210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867240.52413: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867240.52507: variable 'omit' from source: magic vars 15247 1726867240.52520: starting attempt loop 15247 1726867240.52526: running the handler 15247 1726867240.52542: _low_level_execute_command(): starting 15247 1726867240.52555: _low_level_execute_command(): executing: 
/bin/sh -c 'echo ~ && sleep 0' 15247 1726867240.53213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867240.53318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867240.53332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867240.53347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867240.53419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867240.55102: stdout chunk (state=3): >>>/root <<< 15247 1726867240.55307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867240.55311: stdout chunk (state=3): >>><<< 15247 1726867240.55313: stderr chunk (state=3): >>><<< 15247 1726867240.55488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867240.55492: _low_level_execute_command(): starting 15247 1726867240.55496: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818 `" && echo ansible-tmp-1726867240.5539763-15832-264292730604818="` echo /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818 `" ) && sleep 0' 15247 1726867240.56679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867240.56693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867240.56706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867240.56902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867240.56907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867240.56966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867240.58962: stdout chunk (state=3): >>>ansible-tmp-1726867240.5539763-15832-264292730604818=/root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818 <<< 15247 1726867240.59099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867240.59200: stderr chunk (state=3): >>><<< 15247 1726867240.59204: stdout chunk (state=3): >>><<< 15247 1726867240.59217: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867240.5539763-15832-264292730604818=/root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867240.59272: variable 'ansible_module_compression' from source: unknown 15247 1726867240.59328: ANSIBALLZ: Using lock for package_facts 15247 1726867240.59683: ANSIBALLZ: Acquiring lock 15247 1726867240.59687: ANSIBALLZ: Lock acquired: 140393874360512 15247 1726867240.59689: ANSIBALLZ: Creating module 15247 1726867240.93450: ANSIBALLZ: Writing module into payload 15247 1726867240.93582: ANSIBALLZ: Writing module 15247 1726867240.93613: ANSIBALLZ: Renaming module 15247 1726867240.93617: ANSIBALLZ: Done creating module 15247 1726867240.93647: variable 'ansible_facts' from source: unknown 15247 1726867240.93768: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py 15247 1726867240.93883: Sending initial data 15247 1726867240.93886: Sent initial data (162 bytes) 15247 1726867240.94522: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867240.94537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867240.94613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867240.96271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15247 1726867240.96287: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867240.96328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867240.96363: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmppt_9epgx /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py <<< 15247 1726867240.96373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py" <<< 15247 1726867240.96405: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmppt_9epgx" to remote "/root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py" <<< 15247 1726867240.96411: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py" <<< 15247 1726867240.97497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867240.97554: stderr chunk (state=3): >>><<< 15247 1726867240.97563: stdout chunk (state=3): >>><<< 15247 1726867240.97566: done transferring module to remote 15247 1726867240.97569: _low_level_execute_command(): starting 15247 1726867240.97586: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/ /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py && sleep 0' 15247 1726867240.98040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867240.98043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867240.98046: stderr chunk (state=3): >>>debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867240.98048: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867240.98051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867240.98102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867240.98108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867240.98165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867240.99968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867240.99990: stderr chunk (state=3): >>><<< 15247 1726867240.99994: stdout chunk (state=3): >>><<< 15247 1726867241.00008: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867241.00011: _low_level_execute_command(): starting 15247 1726867241.00014: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/AnsiballZ_package_facts.py && sleep 0' 15247 1726867241.00573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867241.00582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867241.00585: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867241.00587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867241.00589: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867241.00634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867241.00641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867241.00683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867241.44915: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15247 1726867241.44938: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 15247 1726867241.44956: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": 
"polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 15247 1726867241.45017: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common",
"version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": 
"python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": 
[{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", 
"release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10",
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": 
"2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}],
"python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15247 1726867241.46821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
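The payload above is the return value of Ansible's package_facts module: under "ansible_facts" -> "packages", each package name maps to a list of install records (a list because several versions or architectures can be installed side by side), each with "name", "version", "release", "epoch", "arch", and "source" fields. A minimal sketch of querying that structure outside Ansible, using a two-entry sample copied from the log (the helper name installed_versions is illustrative, not part of Ansible):

```python
# Sketch: querying the dict shape emitted by Ansible's package_facts module.
# Sample data copied from the log above; real facts would come from a play,
# e.g. registered output or hostvars, not a literal like this.

packages = {
    "openssl": [
        {"name": "openssl", "version": "3.2.2", "release": "12.el10",
         "epoch": 1, "arch": "x86_64", "source": "rpm"},
    ],
    "kernel": [
        {"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10",
         "epoch": None, "arch": "x86_64", "source": "rpm"},
    ],
}

def installed_versions(facts: dict, name: str) -> list[str]:
    """Return every installed version string for a package, or [] if absent."""
    return [pkg["version"] for pkg in facts.get(name, [])]

print(installed_versions(packages, "kernel"))   # ['6.11.0']
print(installed_versions(packages, "missing"))  # []
```

Inside a playbook the same check is usually written as `"openssl" in ansible_facts.packages` after a `package_facts:` task.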
<<< 15247 1726867241.46846: stderr chunk (state=3): >>><<< 15247 1726867241.46849: stdout chunk (state=3): >>><<< 15247 1726867241.46886: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
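The `package_facts` result above maps each package name to a list of dicts with `name`, `version`, `release`, `epoch`, and `arch` keys (`epoch` is `null` for most packages). A minimal sketch, using a hypothetical helper that is not part of Ansible, of rendering such entries as RPM NEVRA-style strings — the sample entries are copied from the dump above:

```python
def nevra(pkg):
    """Render one package_facts entry as an RPM NEVRA-style string.

    A null/None epoch is omitted, matching rpm's display convention.
    """
    epoch = f"{pkg['epoch']}:" if pkg.get("epoch") else ""
    return f"{pkg['name']}-{epoch}{pkg['version']}-{pkg['release']}.{pkg['arch']}"


# Two entries taken verbatim from the facts dump above: one without an
# epoch (systemd) and one with epoch 1 (grub2-pc).
packages = {
    "systemd": [{"name": "systemd", "version": "256", "release": "14.el10",
                 "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10",
                  "epoch": 1, "arch": "x86_64", "source": "rpm"}],
}

for entries in packages.values():
    for entry in entries:
        print(nevra(entry))
# systemd-256-14.el10.x86_64
# grub2-pc-1:2.06-127.el10.x86_64
```

Each value is a list because multiple versions of the same package name (e.g. several installed kernels) can coexist; code consuming these facts should iterate the list rather than assume a single entry.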
15247 1726867241.48842: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867241.48860: _low_level_execute_command(): starting 15247 1726867241.48863: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867240.5539763-15832-264292730604818/ > /dev/null 2>&1 && sleep 0' 15247 1726867241.49452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867241.49455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867241.49459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867241.49462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867241.49471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867241.49531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867241.49544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867241.49611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867241.51445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867241.51498: stderr chunk (state=3): >>><<< 15247 1726867241.51501: stdout chunk (state=3): >>><<< 15247 1726867241.51520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 15247 1726867241.51544: handler run complete 15247 1726867241.52319: variable 'ansible_facts' from source: unknown 15247 1726867241.52690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867241.54834: variable 'ansible_facts' from source: unknown 15247 1726867241.55101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867241.55615: attempt loop complete, returning result 15247 1726867241.55624: _execute() done 15247 1726867241.55627: dumping result to json 15247 1726867241.55764: done dumping result, returning 15247 1726867241.55780: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-8ce3-1923-000000000193] 15247 1726867241.55783: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000193 15247 1726867241.58396: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000193 15247 1726867241.58400: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867241.58502: no more pending results, returning what we have 15247 1726867241.58505: results queue empty 15247 1726867241.58506: checking for any_errors_fatal 15247 1726867241.58513: done checking for any_errors_fatal 15247 1726867241.58513: checking for max_fail_percentage 15247 1726867241.58515: done checking for max_fail_percentage 15247 1726867241.58515: checking to see if all hosts have failed and the running result is not ok 15247 1726867241.58516: done checking to see if all hosts have failed 15247 1726867241.58517: getting the remaining hosts for this loop 15247 1726867241.58518: done getting the remaining hosts for this loop 15247 1726867241.58521: getting the next task for host managed_node2 15247 
1726867241.58527: done getting next task for host managed_node2 15247 1726867241.58531: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15247 1726867241.58533: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867241.58541: getting variables 15247 1726867241.58542: in VariableManager get_vars() 15247 1726867241.58580: Calling all_inventory to load vars for managed_node2 15247 1726867241.58583: Calling groups_inventory to load vars for managed_node2 15247 1726867241.58586: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867241.58595: Calling all_plugins_play to load vars for managed_node2 15247 1726867241.58598: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867241.58601: Calling groups_plugins_play to load vars for managed_node2 15247 1726867241.59547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867241.60399: done with get_vars() 15247 1726867241.60416: done getting variables 15247 1726867241.60457: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:41 -0400 (0:00:01.120) 0:00:11.314 ****** 15247 1726867241.60476: entering _queue_task() for managed_node2/debug 15247 1726867241.60691: worker is 1 (out of 
1 available) 15247 1726867241.60703: exiting _queue_task() for managed_node2/debug 15247 1726867241.60715: done queuing things up, now waiting for results queue to drain 15247 1726867241.60717: waiting for pending results... 15247 1726867241.61179: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 15247 1726867241.61334: in run() - task 0affcac9-a3a5-8ce3-1923-000000000015 15247 1726867241.61412: variable 'ansible_search_path' from source: unknown 15247 1726867241.61444: variable 'ansible_search_path' from source: unknown 15247 1726867241.61567: calling self._execute() 15247 1726867241.61683: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867241.61804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867241.61845: variable 'omit' from source: magic vars 15247 1726867241.62313: variable 'ansible_distribution_major_version' from source: facts 15247 1726867241.62317: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867241.62319: variable 'omit' from source: magic vars 15247 1726867241.62348: variable 'omit' from source: magic vars 15247 1726867241.62475: variable 'network_provider' from source: set_fact 15247 1726867241.62481: variable 'omit' from source: magic vars 15247 1726867241.62512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867241.62554: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867241.62585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867241.62615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867241.62619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 15247 1726867241.62682: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867241.62691: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867241.62694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867241.62791: Set connection var ansible_shell_executable to /bin/sh 15247 1726867241.62823: Set connection var ansible_connection to ssh 15247 1726867241.62826: Set connection var ansible_shell_type to sh 15247 1726867241.62828: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867241.62830: Set connection var ansible_timeout to 10 15247 1726867241.62832: Set connection var ansible_pipelining to False 15247 1726867241.62855: variable 'ansible_shell_executable' from source: unknown 15247 1726867241.62859: variable 'ansible_connection' from source: unknown 15247 1726867241.62861: variable 'ansible_module_compression' from source: unknown 15247 1726867241.62863: variable 'ansible_shell_type' from source: unknown 15247 1726867241.62866: variable 'ansible_shell_executable' from source: unknown 15247 1726867241.62867: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867241.62869: variable 'ansible_pipelining' from source: unknown 15247 1726867241.62871: variable 'ansible_timeout' from source: unknown 15247 1726867241.62873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867241.63003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867241.63019: variable 'omit' from source: magic vars 15247 1726867241.63022: starting attempt loop 15247 1726867241.63025: running the handler 15247 
1726867241.63053: handler run complete 15247 1726867241.63063: attempt loop complete, returning result 15247 1726867241.63066: _execute() done 15247 1726867241.63073: dumping result to json 15247 1726867241.63076: done dumping result, returning 15247 1726867241.63092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-8ce3-1923-000000000015] 15247 1726867241.63111: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000015 15247 1726867241.63194: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000015 15247 1726867241.63197: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 15247 1726867241.63285: no more pending results, returning what we have 15247 1726867241.63288: results queue empty 15247 1726867241.63289: checking for any_errors_fatal 15247 1726867241.63296: done checking for any_errors_fatal 15247 1726867241.63296: checking for max_fail_percentage 15247 1726867241.63298: done checking for max_fail_percentage 15247 1726867241.63298: checking to see if all hosts have failed and the running result is not ok 15247 1726867241.63299: done checking to see if all hosts have failed 15247 1726867241.63300: getting the remaining hosts for this loop 15247 1726867241.63301: done getting the remaining hosts for this loop 15247 1726867241.63309: getting the next task for host managed_node2 15247 1726867241.63314: done getting next task for host managed_node2 15247 1726867241.63317: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15247 1726867241.63319: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15247 1726867241.63327: getting variables 15247 1726867241.63329: in VariableManager get_vars() 15247 1726867241.63359: Calling all_inventory to load vars for managed_node2 15247 1726867241.63361: Calling groups_inventory to load vars for managed_node2 15247 1726867241.63363: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867241.63371: Calling all_plugins_play to load vars for managed_node2 15247 1726867241.63373: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867241.63376: Calling groups_plugins_play to load vars for managed_node2 15247 1726867241.65476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867241.67341: done with get_vars() 15247 1726867241.67366: done getting variables 15247 1726867241.67473: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:41 -0400 (0:00:00.070) 0:00:11.384 ****** 15247 1726867241.67505: entering _queue_task() for managed_node2/fail 15247 1726867241.67507: Creating lock for fail 15247 1726867241.68026: worker is 1 (out of 1 available) 15247 1726867241.68039: exiting _queue_task() for managed_node2/fail 15247 1726867241.68097: done queuing things up, now waiting for results queue to drain 15247 1726867241.68099: waiting for pending results... 
15247 1726867241.68510: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15247 1726867241.68581: in run() - task 0affcac9-a3a5-8ce3-1923-000000000016 15247 1726867241.68631: variable 'ansible_search_path' from source: unknown 15247 1726867241.68636: variable 'ansible_search_path' from source: unknown 15247 1726867241.68664: calling self._execute() 15247 1726867241.68787: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867241.68849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867241.68854: variable 'omit' from source: magic vars 15247 1726867241.69367: variable 'ansible_distribution_major_version' from source: facts 15247 1726867241.69409: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867241.69541: variable 'network_state' from source: role '' defaults 15247 1726867241.69682: Evaluated conditional (network_state != {}): False 15247 1726867241.69686: when evaluation is False, skipping this task 15247 1726867241.69689: _execute() done 15247 1726867241.69691: dumping result to json 15247 1726867241.69693: done dumping result, returning 15247 1726867241.69696: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-8ce3-1923-000000000016] 15247 1726867241.69698: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000016 15247 1726867241.69761: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000016 15247 1726867241.69764: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867241.69814: no more pending results, 
returning what we have 15247 1726867241.69818: results queue empty 15247 1726867241.69819: checking for any_errors_fatal 15247 1726867241.69825: done checking for any_errors_fatal 15247 1726867241.69826: checking for max_fail_percentage 15247 1726867241.69828: done checking for max_fail_percentage 15247 1726867241.69828: checking to see if all hosts have failed and the running result is not ok 15247 1726867241.69829: done checking to see if all hosts have failed 15247 1726867241.69830: getting the remaining hosts for this loop 15247 1726867241.69831: done getting the remaining hosts for this loop 15247 1726867241.69834: getting the next task for host managed_node2 15247 1726867241.69840: done getting next task for host managed_node2 15247 1726867241.69843: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15247 1726867241.69846: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867241.69859: getting variables 15247 1726867241.69861: in VariableManager get_vars() 15247 1726867241.69933: Calling all_inventory to load vars for managed_node2 15247 1726867241.69935: Calling groups_inventory to load vars for managed_node2 15247 1726867241.69938: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867241.69950: Calling all_plugins_play to load vars for managed_node2 15247 1726867241.69953: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867241.69956: Calling groups_plugins_play to load vars for managed_node2 15247 1726867241.72121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867241.75060: done with get_vars() 15247 1726867241.75091: done getting variables 15247 1726867241.75157: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:41 -0400 (0:00:00.076) 0:00:11.461 ****** 15247 1726867241.75191: entering _queue_task() for managed_node2/fail 15247 1726867241.75663: worker is 1 (out of 1 available) 15247 1726867241.75675: exiting _queue_task() for managed_node2/fail 15247 1726867241.75689: done queuing things up, now waiting for results queue to drain 15247 1726867241.75691: waiting for pending results... 
15247 1726867241.76110: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15247 1726867241.76118: in run() - task 0affcac9-a3a5-8ce3-1923-000000000017 15247 1726867241.76121: variable 'ansible_search_path' from source: unknown 15247 1726867241.76124: variable 'ansible_search_path' from source: unknown 15247 1726867241.76127: calling self._execute() 15247 1726867241.76216: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867241.76226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867241.76307: variable 'omit' from source: magic vars 15247 1726867241.76646: variable 'ansible_distribution_major_version' from source: facts 15247 1726867241.76659: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867241.77272: variable 'network_state' from source: role '' defaults 15247 1726867241.77288: Evaluated conditional (network_state != {}): False 15247 1726867241.77291: when evaluation is False, skipping this task 15247 1726867241.77294: _execute() done 15247 1726867241.77297: dumping result to json 15247 1726867241.77305: done dumping result, returning 15247 1726867241.77326: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-8ce3-1923-000000000017] 15247 1726867241.77370: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000017 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867241.77772: no more pending results, returning what we have 15247 1726867241.77775: results queue empty 15247 1726867241.77776: checking for any_errors_fatal 15247 1726867241.77783: done checking for any_errors_fatal 
15247 1726867241.77783: checking for max_fail_percentage 15247 1726867241.77787: done checking for max_fail_percentage 15247 1726867241.77787: checking to see if all hosts have failed and the running result is not ok 15247 1726867241.77788: done checking to see if all hosts have failed 15247 1726867241.77789: getting the remaining hosts for this loop 15247 1726867241.77790: done getting the remaining hosts for this loop 15247 1726867241.77793: getting the next task for host managed_node2 15247 1726867241.77798: done getting next task for host managed_node2 15247 1726867241.77802: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15247 1726867241.77804: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867241.77817: getting variables 15247 1726867241.77819: in VariableManager get_vars() 15247 1726867241.77852: Calling all_inventory to load vars for managed_node2 15247 1726867241.77854: Calling groups_inventory to load vars for managed_node2 15247 1726867241.77858: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867241.77867: Calling all_plugins_play to load vars for managed_node2 15247 1726867241.77871: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867241.77875: Calling groups_plugins_play to load vars for managed_node2 15247 1726867241.78490: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000017 15247 1726867241.78495: WORKER PROCESS EXITING 15247 1726867241.79322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867241.81604: done with get_vars() 15247 1726867241.81624: done getting variables 15247 1726867241.81796: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:41 -0400 (0:00:00.066) 0:00:11.527 ****** 15247 1726867241.81833: entering _queue_task() for managed_node2/fail 15247 1726867241.82486: worker is 1 (out of 1 available) 15247 1726867241.82498: exiting _queue_task() for managed_node2/fail 15247 1726867241.82510: done queuing things up, now waiting for results queue to drain 15247 1726867241.82511: waiting for pending results... 
15247 1726867241.83104: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15247 1726867241.83221: in run() - task 0affcac9-a3a5-8ce3-1923-000000000018 15247 1726867241.83230: variable 'ansible_search_path' from source: unknown 15247 1726867241.83237: variable 'ansible_search_path' from source: unknown 15247 1726867241.83275: calling self._execute() 15247 1726867241.83533: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867241.83537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867241.83545: variable 'omit' from source: magic vars 15247 1726867241.84359: variable 'ansible_distribution_major_version' from source: facts 15247 1726867241.84402: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867241.84637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867241.88255: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867241.88258: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867241.88386: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867241.88475: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867241.88516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867241.88670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867241.88841: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867241.89033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867241.89061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867241.89080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867241.89350: variable 'ansible_distribution_major_version' from source: facts 15247 1726867241.89457: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15247 1726867241.89689: variable 'ansible_distribution' from source: facts 15247 1726867241.89692: variable '__network_rh_distros' from source: role '' defaults 15247 1726867241.89695: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15247 1726867241.90159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867241.90197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867241.90235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 
1726867241.90284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867241.90314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867241.90371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867241.90441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867241.90512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867241.90633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867241.90657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867241.90720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867241.90756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867241.90793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867241.90882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867241.90889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867241.91242: variable 'network_connections' from source: play vars
15247 1726867241.91259: variable 'interface' from source: set_fact
15247 1726867241.91345: variable 'interface' from source: set_fact
15247 1726867241.91358: variable 'interface' from source: set_fact
15247 1726867241.91683: variable 'interface' from source: set_fact
15247 1726867241.91686: variable 'network_state' from source: role '' defaults
15247 1726867241.91696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15247 1726867241.91772: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15247 1726867241.91816: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15247 1726867241.91852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15247 1726867241.91880: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15247 1726867241.91924: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15247 1726867241.92025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15247 1726867241.92030: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867241.92032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15247 1726867241.92035: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
15247 1726867241.92038: when evaluation is False, skipping this task
15247 1726867241.92040: _execute() done
15247 1726867241.92042: dumping result to json
15247 1726867241.92044: done dumping result, returning
15247 1726867241.92047: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-8ce3-1923-000000000018]
15247 1726867241.92049: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000018
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
15247 1726867241.92281: no more pending results, returning what we have
15247 1726867241.92284: results queue empty
15247 1726867241.92285: checking for any_errors_fatal
15247 1726867241.92290: done checking for any_errors_fatal
15247 1726867241.92290: checking for max_fail_percentage
15247 1726867241.92292: done checking for max_fail_percentage
15247 1726867241.92292: checking to see if all hosts have failed and the running result is not ok
15247 1726867241.92293: done checking to see if all hosts have failed
15247 1726867241.92294: getting the remaining hosts for this loop
15247 1726867241.92295: done getting the remaining hosts for this loop
15247 1726867241.92299: getting the next task for host managed_node2
15247 1726867241.92304: done getting next task for host managed_node2
15247 1726867241.92309: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15247 1726867241.92311: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867241.92322: getting variables
15247 1726867241.92323: in VariableManager get_vars()
15247 1726867241.92388: Calling all_inventory to load vars for managed_node2
15247 1726867241.92391: Calling groups_inventory to load vars for managed_node2
15247 1726867241.92394: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867241.92403: Calling all_plugins_play to load vars for managed_node2
15247 1726867241.92406: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867241.92410: Calling groups_plugins_play to load vars for managed_node2
15247 1726867241.93673: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000018
15247 1726867241.94185: WORKER PROCESS EXITING
15247 1726867241.94749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867241.96514: done with get_vars()
15247 1726867241.96543: done getting variables
15247 1726867241.96651: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Friday 20 September 2024 17:20:41 -0400 (0:00:00.148) 0:00:11.676 ******
15247 1726867241.96683: entering _queue_task() for managed_node2/dnf
15247 1726867241.97011: worker is 1 (out of 1 available)
15247 1726867241.97024: exiting _queue_task() for managed_node2/dnf
15247 1726867241.97037: done queuing things up, now waiting for results queue to drain
15247 1726867241.97038: waiting for pending results...
15247 1726867241.97316: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
15247 1726867241.97483: in run() - task 0affcac9-a3a5-8ce3-1923-000000000019
15247 1726867241.97487: variable 'ansible_search_path' from source: unknown
15247 1726867241.97490: variable 'ansible_search_path' from source: unknown
15247 1726867241.97516: calling self._execute()
15247 1726867241.97599: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867241.97625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867241.97640: variable 'omit' from source: magic vars
15247 1726867241.98017: variable 'ansible_distribution_major_version' from source: facts
15247 1726867241.98032: Evaluated conditional (ansible_distribution_major_version != '6'): True
15247 1726867241.98268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15247 1726867242.00520: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15247 1726867242.00602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15247 1726867242.00648: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15247 1726867242.00697: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15247 1726867242.00731: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15247 1726867242.00860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.00863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.00886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.00934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.00945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.01033: variable 'ansible_distribution' from source: facts
15247 1726867242.01037: variable 'ansible_distribution_major_version' from source: facts
15247 1726867242.01048: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
15247 1726867242.01129: variable '__network_wireless_connections_defined' from source: role '' defaults
15247 1726867242.01209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.01236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.01253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.01279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.01291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.01322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.01340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.01357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.01382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.01393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.01423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.01441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.01459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.01483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.01493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.01598: variable 'network_connections' from source: play vars
15247 1726867242.01607: variable 'interface' from source: set_fact
15247 1726867242.01658: variable 'interface' from source: set_fact
15247 1726867242.01665: variable 'interface' from source: set_fact
15247 1726867242.01712: variable 'interface' from source: set_fact
15247 1726867242.01757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15247 1726867242.01879: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15247 1726867242.01907: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15247 1726867242.01931: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15247 1726867242.01951: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15247 1726867242.01983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15247 1726867242.02001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15247 1726867242.02025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.02042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15247 1726867242.02093: variable '__network_team_connections_defined' from source: role '' defaults
15247 1726867242.02242: variable 'network_connections' from source: play vars
15247 1726867242.02246: variable 'interface' from source: set_fact
15247 1726867242.02290: variable 'interface' from source: set_fact
15247 1726867242.02295: variable 'interface' from source: set_fact
15247 1726867242.02342: variable 'interface' from source: set_fact
15247 1726867242.02364: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15247 1726867242.02367: when evaluation is False, skipping this task
15247 1726867242.02370: _execute() done
15247 1726867242.02372: dumping result to json
15247 1726867242.02375: done dumping result, returning
15247 1726867242.02384: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000019]
15247 1726867242.02390: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000019
15247 1726867242.02479: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000019
15247 1726867242.02482: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15247 1726867242.02527: no more pending results, returning what we have
15247 1726867242.02530: results queue empty
15247 1726867242.02531: checking for any_errors_fatal
15247 1726867242.02536: done checking for any_errors_fatal
15247 1726867242.02537: checking for max_fail_percentage
15247 1726867242.02539: done checking for max_fail_percentage
15247 1726867242.02540: checking to see if all hosts have failed and the running result is not ok
15247 1726867242.02540: done checking to see if all hosts have failed
15247 1726867242.02541: getting the remaining hosts for this loop
15247 1726867242.02542: done getting the remaining hosts for this loop
15247 1726867242.02546: getting the next task for host managed_node2
15247 1726867242.02551: done getting next task for host managed_node2
15247 1726867242.02555: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
15247 1726867242.02556: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867242.02568: getting variables
15247 1726867242.02570: in VariableManager get_vars()
15247 1726867242.02606: Calling all_inventory to load vars for managed_node2
15247 1726867242.02609: Calling groups_inventory to load vars for managed_node2
15247 1726867242.02611: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867242.02620: Calling all_plugins_play to load vars for managed_node2
15247 1726867242.02622: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867242.02624: Calling groups_plugins_play to load vars for managed_node2
15247 1726867242.03420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867242.04838: done with get_vars()
15247 1726867242.04854: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
15247 1726867242.04910: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Friday 20 September 2024 17:20:42 -0400 (0:00:00.082) 0:00:11.758 ******
15247 1726867242.04930: entering _queue_task() for managed_node2/yum
15247 1726867242.04931: Creating lock for yum
15247 1726867242.05149: worker is 1 (out of 1 available)
15247 1726867242.05162: exiting _queue_task() for managed_node2/yum
15247 1726867242.05173: done queuing things up, now waiting for results queue to drain
15247 1726867242.05174: waiting for pending results...
15247 1726867242.05338: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
15247 1726867242.05406: in run() - task 0affcac9-a3a5-8ce3-1923-00000000001a
15247 1726867242.05420: variable 'ansible_search_path' from source: unknown
15247 1726867242.05424: variable 'ansible_search_path' from source: unknown
15247 1726867242.05449: calling self._execute()
15247 1726867242.05510: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867242.05518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867242.05527: variable 'omit' from source: magic vars
15247 1726867242.05783: variable 'ansible_distribution_major_version' from source: facts
15247 1726867242.05792: Evaluated conditional (ansible_distribution_major_version != '6'): True
15247 1726867242.05912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15247 1726867242.07408: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15247 1726867242.07469: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15247 1726867242.07548: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15247 1726867242.07551: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15247 1726867242.07591: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15247 1726867242.07783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.07787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.07789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.07791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.07793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.07835: variable 'ansible_distribution_major_version' from source: facts
15247 1726867242.07849: Evaluated conditional (ansible_distribution_major_version | int < 8): False
15247 1726867242.07852: when evaluation is False, skipping this task
15247 1726867242.07854: _execute() done
15247 1726867242.07857: dumping result to json
15247 1726867242.07859: done dumping result, returning
15247 1726867242.07870: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-00000000001a]
15247 1726867242.07875: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001a
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
15247 1726867242.08122: no more pending results, returning what we have
15247 1726867242.08125: results queue empty
15247 1726867242.08126: checking for any_errors_fatal
15247 1726867242.08130: done checking for any_errors_fatal
15247 1726867242.08131: checking for max_fail_percentage
15247 1726867242.08133: done checking for max_fail_percentage
15247 1726867242.08133: checking to see if all hosts have failed and the running result is not ok
15247 1726867242.08134: done checking to see if all hosts have failed
15247 1726867242.08135: getting the remaining hosts for this loop
15247 1726867242.08143: done getting the remaining hosts for this loop
15247 1726867242.08146: getting the next task for host managed_node2
15247 1726867242.08151: done getting next task for host managed_node2
15247 1726867242.08155: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
15247 1726867242.08157: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867242.08169: getting variables
15247 1726867242.08170: in VariableManager get_vars()
15247 1726867242.08215: Calling all_inventory to load vars for managed_node2
15247 1726867242.08218: Calling groups_inventory to load vars for managed_node2
15247 1726867242.08221: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867242.08227: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001a
15247 1726867242.08230: WORKER PROCESS EXITING
15247 1726867242.08238: Calling all_plugins_play to load vars for managed_node2
15247 1726867242.08241: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867242.08250: Calling groups_plugins_play to load vars for managed_node2
15247 1726867242.09644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867242.11352: done with get_vars()
15247 1726867242.11373: done getting variables
15247 1726867242.11446: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Friday 20 September 2024 17:20:42 -0400 (0:00:00.065) 0:00:11.824 ******
15247 1726867242.11476: entering _queue_task() for managed_node2/fail
15247 1726867242.11791: worker is 1 (out of 1 available)
15247 1726867242.11803: exiting _queue_task() for managed_node2/fail
15247 1726867242.11814: done queuing things up, now waiting for results queue to drain
15247 1726867242.11816: waiting for pending results...
15247 1726867242.12097: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
15247 1726867242.12210: in run() - task 0affcac9-a3a5-8ce3-1923-00000000001b
15247 1726867242.12232: variable 'ansible_search_path' from source: unknown
15247 1726867242.12240: variable 'ansible_search_path' from source: unknown
15247 1726867242.12280: calling self._execute()
15247 1726867242.12375: variable 'ansible_host' from source: host vars for 'managed_node2'
15247 1726867242.12408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2'
15247 1726867242.12436: variable 'omit' from source: magic vars
15247 1726867242.12817: variable 'ansible_distribution_major_version' from source: facts
15247 1726867242.12844: Evaluated conditional (ansible_distribution_major_version != '6'): True
15247 1726867242.13060: variable '__network_wireless_connections_defined' from source: role '' defaults
15247 1726867242.13190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
15247 1726867242.15530: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
15247 1726867242.15609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
15247 1726867242.15651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
15247 1726867242.15692: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
15247 1726867242.15735: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
15247 1726867242.15883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.15886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.15891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.15948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.15967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.16020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.16062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.16094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.16149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.16282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.16285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
15247 1726867242.16288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
15247 1726867242.16290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.16320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
15247 1726867242.16340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
15247 1726867242.16534: variable 'network_connections' from source: play vars
15247 1726867242.16552: variable 'interface' from source: set_fact
15247 1726867242.16644: variable 'interface' from source: set_fact
15247 1726867242.16659: variable 'interface' from source: set_fact
15247 1726867242.16725: variable 'interface' from source: set_fact
15247 1726867242.16814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
15247 1726867242.17350: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
15247 1726867242.17482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
15247 1726867242.17485: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
15247 1726867242.17489: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
15247 1726867242.17522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
15247 1726867242.17548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
15247 1726867242.17579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
15247 1726867242.17624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
15247 1726867242.17687: variable '__network_team_connections_defined' from source: role '' defaults
15247 1726867242.17958: variable 'network_connections' from source: play vars
15247 1726867242.17968: variable 'interface' from source: set_fact
15247 1726867242.18030: variable 'interface' from source: set_fact
15247 1726867242.18057: variable 'interface' from source: set_fact
15247 1726867242.18120: variable 'interface' from source: set_fact
15247 1726867242.18166: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
15247 1726867242.18258: when evaluation is False, skipping this task
15247 1726867242.18262: _execute() done
15247 1726867242.18265: dumping result to json
15247 1726867242.18267: done dumping result, returning
15247 1726867242.18270: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-00000000001b]
15247 1726867242.18283: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001b
15247 1726867242.18353: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001b
15247 1726867242.18357: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
15247 1726867242.18411: no more pending results, returning what we have
15247 1726867242.18415: results queue empty
15247 1726867242.18416: checking for any_errors_fatal
15247 1726867242.18421: done checking for any_errors_fatal
15247 1726867242.18422: checking for max_fail_percentage
15247 1726867242.18425: done checking for max_fail_percentage
15247 1726867242.18425: checking to see if all hosts have failed and the running result is not ok
15247 1726867242.18426: done checking to see if all hosts have failed
15247 1726867242.18427: getting the remaining hosts for this loop
15247 1726867242.18429: done getting the remaining hosts for this loop
15247 1726867242.18432: getting the next task for host managed_node2
15247 1726867242.18438: done getting next task for host managed_node2
15247 1726867242.18443: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
15247 1726867242.18445: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867242.18459: getting variables
15247 1726867242.18461: in VariableManager get_vars()
15247 1726867242.18502: Calling all_inventory to load vars for managed_node2
15247 1726867242.18504: Calling groups_inventory to load vars for managed_node2
15247 1726867242.18507: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867242.18519: Calling all_plugins_play to load vars for managed_node2
15247 1726867242.18522: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867242.18525: Calling groups_plugins_play to load vars for managed_node2
15247 1726867242.20358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867242.22004: done with get_vars()
15247 1726867242.22025: done getting variables
15247 1726867242.22095: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 17:20:42 -0400 (0:00:00.106) 0:00:11.930 ******
15247 1726867242.22124: entering _queue_task() for managed_node2/package
15247 1726867242.22605: worker is 1 (out of 1 available)
15247 1726867242.22616: exiting _queue_task() for managed_node2/package
15247 1726867242.22626: done queuing things up, now waiting for results queue to drain
15247 1726867242.22628: waiting for pending results...
15247 1726867242.22875: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 15247 1726867242.22882: in run() - task 0affcac9-a3a5-8ce3-1923-00000000001c 15247 1726867242.22885: variable 'ansible_search_path' from source: unknown 15247 1726867242.22887: variable 'ansible_search_path' from source: unknown 15247 1726867242.22889: calling self._execute() 15247 1726867242.22976: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.23087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.23092: variable 'omit' from source: magic vars 15247 1726867242.23379: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.23402: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867242.23609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867242.23898: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867242.23958: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867242.23997: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867242.24033: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867242.24153: variable 'network_packages' from source: role '' defaults 15247 1726867242.24298: variable '__network_provider_setup' from source: role '' defaults 15247 1726867242.24305: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867242.24368: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867242.24383: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867242.24427: variable 
'__network_packages_default_nm' from source: role '' defaults 15247 1726867242.24542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867242.25885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867242.25939: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867242.25965: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867242.25989: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867242.26008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867242.26069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.26091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.26112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.26142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.26152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 
1726867242.26185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.26201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.26222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.26250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.26260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.26403: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15247 1726867242.26475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.26493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.26511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.26536: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.26546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.26610: variable 'ansible_python' from source: facts 15247 1726867242.26647: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15247 1726867242.26723: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867242.26783: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867242.26899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.26923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.26991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.26995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.26997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.27038: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.27060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.27083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.27182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.27190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.27279: variable 'network_connections' from source: play vars 15247 1726867242.27284: variable 'interface' from source: set_fact 15247 1726867242.27419: variable 'interface' from source: set_fact 15247 1726867242.27422: variable 'interface' from source: set_fact 15247 1726867242.27499: variable 'interface' from source: set_fact 15247 1726867242.27563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867242.27601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867242.27621: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.27683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867242.27702: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867242.27926: variable 'network_connections' from source: play vars 15247 1726867242.27929: variable 'interface' from source: set_fact 15247 1726867242.28084: variable 'interface' from source: set_fact 15247 1726867242.28087: variable 'interface' from source: set_fact 15247 1726867242.28125: variable 'interface' from source: set_fact 15247 1726867242.28168: variable '__network_packages_default_wireless' from source: role '' defaults 15247 1726867242.28245: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867242.28543: variable 'network_connections' from source: play vars 15247 1726867242.28546: variable 'interface' from source: set_fact 15247 1726867242.28623: variable 'interface' from source: set_fact 15247 1726867242.28626: variable 'interface' from source: set_fact 15247 1726867242.28729: variable 'interface' from source: set_fact 15247 1726867242.28732: variable '__network_packages_default_team' from source: role '' defaults 15247 1726867242.28769: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867242.29065: variable 'network_connections' from source: play vars 15247 1726867242.29068: variable 'interface' from source: set_fact 15247 1726867242.29137: variable 'interface' from source: set_fact 15247 1726867242.29142: variable 'interface' from source: set_fact 15247 1726867242.29206: variable 'interface' from source: set_fact 15247 1726867242.29249: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 
1726867242.29297: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867242.29300: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867242.29343: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867242.29490: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15247 1726867242.29795: variable 'network_connections' from source: play vars 15247 1726867242.29799: variable 'interface' from source: set_fact 15247 1726867242.29847: variable 'interface' from source: set_fact 15247 1726867242.29852: variable 'interface' from source: set_fact 15247 1726867242.29895: variable 'interface' from source: set_fact 15247 1726867242.29902: variable 'ansible_distribution' from source: facts 15247 1726867242.29905: variable '__network_rh_distros' from source: role '' defaults 15247 1726867242.29913: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.29941: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15247 1726867242.30046: variable 'ansible_distribution' from source: facts 15247 1726867242.30051: variable '__network_rh_distros' from source: role '' defaults 15247 1726867242.30054: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.30061: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15247 1726867242.30169: variable 'ansible_distribution' from source: facts 15247 1726867242.30172: variable '__network_rh_distros' from source: role '' defaults 15247 1726867242.30175: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.30202: variable 'network_provider' from source: set_fact 15247 1726867242.30215: variable 'ansible_facts' from source: unknown 15247 1726867242.30641: Evaluated conditional (not network_packages is 
subset(ansible_facts.packages.keys())): False 15247 1726867242.30645: when evaluation is False, skipping this task 15247 1726867242.30647: _execute() done 15247 1726867242.30649: dumping result to json 15247 1726867242.30651: done dumping result, returning 15247 1726867242.30659: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-8ce3-1923-00000000001c] 15247 1726867242.30665: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001c 15247 1726867242.30751: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001c 15247 1726867242.30754: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15247 1726867242.30801: no more pending results, returning what we have 15247 1726867242.30805: results queue empty 15247 1726867242.30805: checking for any_errors_fatal 15247 1726867242.30811: done checking for any_errors_fatal 15247 1726867242.30811: checking for max_fail_percentage 15247 1726867242.30813: done checking for max_fail_percentage 15247 1726867242.30814: checking to see if all hosts have failed and the running result is not ok 15247 1726867242.30815: done checking to see if all hosts have failed 15247 1726867242.30815: getting the remaining hosts for this loop 15247 1726867242.30817: done getting the remaining hosts for this loop 15247 1726867242.30820: getting the next task for host managed_node2 15247 1726867242.30825: done getting next task for host managed_node2 15247 1726867242.30829: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15247 1726867242.30830: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867242.30842: getting variables 15247 1726867242.30844: in VariableManager get_vars() 15247 1726867242.30881: Calling all_inventory to load vars for managed_node2 15247 1726867242.30884: Calling groups_inventory to load vars for managed_node2 15247 1726867242.30886: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867242.30899: Calling all_plugins_play to load vars for managed_node2 15247 1726867242.30901: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867242.30904: Calling groups_plugins_play to load vars for managed_node2 15247 1726867242.31709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867242.33021: done with get_vars() 15247 1726867242.33038: done getting variables 15247 1726867242.33084: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:42 -0400 (0:00:00.109) 0:00:12.040 ****** 15247 1726867242.33108: entering _queue_task() for managed_node2/package 15247 1726867242.33325: worker is 1 (out of 1 available) 15247 1726867242.33338: exiting _queue_task() for managed_node2/package 15247 1726867242.33351: done queuing things up, now waiting for results queue to drain 15247 1726867242.33352: waiting for pending results... 
15247 1726867242.33520: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15247 1726867242.33583: in run() - task 0affcac9-a3a5-8ce3-1923-00000000001d 15247 1726867242.33592: variable 'ansible_search_path' from source: unknown 15247 1726867242.33596: variable 'ansible_search_path' from source: unknown 15247 1726867242.33626: calling self._execute() 15247 1726867242.33690: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.33694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.33706: variable 'omit' from source: magic vars 15247 1726867242.33971: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.33980: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867242.34063: variable 'network_state' from source: role '' defaults 15247 1726867242.34071: Evaluated conditional (network_state != {}): False 15247 1726867242.34074: when evaluation is False, skipping this task 15247 1726867242.34078: _execute() done 15247 1726867242.34081: dumping result to json 15247 1726867242.34084: done dumping result, returning 15247 1726867242.34092: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-8ce3-1923-00000000001d] 15247 1726867242.34098: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001d 15247 1726867242.34193: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001d 15247 1726867242.34196: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867242.34265: no more pending results, returning what we have 15247 1726867242.34268: results queue empty 15247 1726867242.34269: checking 
for any_errors_fatal 15247 1726867242.34273: done checking for any_errors_fatal 15247 1726867242.34274: checking for max_fail_percentage 15247 1726867242.34276: done checking for max_fail_percentage 15247 1726867242.34278: checking to see if all hosts have failed and the running result is not ok 15247 1726867242.34280: done checking to see if all hosts have failed 15247 1726867242.34280: getting the remaining hosts for this loop 15247 1726867242.34282: done getting the remaining hosts for this loop 15247 1726867242.34285: getting the next task for host managed_node2 15247 1726867242.34290: done getting next task for host managed_node2 15247 1726867242.34293: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15247 1726867242.34295: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867242.34311: getting variables 15247 1726867242.34312: in VariableManager get_vars() 15247 1726867242.34341: Calling all_inventory to load vars for managed_node2 15247 1726867242.34343: Calling groups_inventory to load vars for managed_node2 15247 1726867242.34345: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867242.34353: Calling all_plugins_play to load vars for managed_node2 15247 1726867242.34355: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867242.34357: Calling groups_plugins_play to load vars for managed_node2 15247 1726867242.38644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867242.39875: done with get_vars() 15247 1726867242.39902: done getting variables 15247 1726867242.39940: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:42 -0400 (0:00:00.068) 0:00:12.109 ****** 15247 1726867242.39960: entering _queue_task() for managed_node2/package 15247 1726867242.40281: worker is 1 (out of 1 available) 15247 1726867242.40293: exiting _queue_task() for managed_node2/package 15247 1726867242.40306: done queuing things up, now waiting for results queue to drain 15247 1726867242.40307: waiting for pending results... 
15247 1726867242.40519: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15247 1726867242.40593: in run() - task 0affcac9-a3a5-8ce3-1923-00000000001e 15247 1726867242.40618: variable 'ansible_search_path' from source: unknown 15247 1726867242.40621: variable 'ansible_search_path' from source: unknown 15247 1726867242.40648: calling self._execute() 15247 1726867242.40720: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.40724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.40733: variable 'omit' from source: magic vars 15247 1726867242.41055: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.41058: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867242.41145: variable 'network_state' from source: role '' defaults 15247 1726867242.41153: Evaluated conditional (network_state != {}): False 15247 1726867242.41156: when evaluation is False, skipping this task 15247 1726867242.41158: _execute() done 15247 1726867242.41161: dumping result to json 15247 1726867242.41163: done dumping result, returning 15247 1726867242.41171: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-8ce3-1923-00000000001e] 15247 1726867242.41176: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001e 15247 1726867242.41278: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001e 15247 1726867242.41281: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867242.41340: no more pending results, returning what we have 15247 1726867242.41343: results queue empty 15247 1726867242.41344: checking for 
any_errors_fatal 15247 1726867242.41355: done checking for any_errors_fatal 15247 1726867242.41356: checking for max_fail_percentage 15247 1726867242.41358: done checking for max_fail_percentage 15247 1726867242.41358: checking to see if all hosts have failed and the running result is not ok 15247 1726867242.41359: done checking to see if all hosts have failed 15247 1726867242.41360: getting the remaining hosts for this loop 15247 1726867242.41361: done getting the remaining hosts for this loop 15247 1726867242.41365: getting the next task for host managed_node2 15247 1726867242.41370: done getting next task for host managed_node2 15247 1726867242.41373: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15247 1726867242.41375: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867242.41392: getting variables 15247 1726867242.41393: in VariableManager get_vars() 15247 1726867242.41426: Calling all_inventory to load vars for managed_node2 15247 1726867242.41429: Calling groups_inventory to load vars for managed_node2 15247 1726867242.41431: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867242.41439: Calling all_plugins_play to load vars for managed_node2 15247 1726867242.41441: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867242.41443: Calling groups_plugins_play to load vars for managed_node2 15247 1726867242.42311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867242.43764: done with get_vars() 15247 1726867242.43789: done getting variables 15247 1726867242.43903: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:42 -0400 (0:00:00.039) 0:00:12.148 ****** 15247 1726867242.43937: entering _queue_task() for managed_node2/service 15247 1726867242.43939: Creating lock for service 15247 1726867242.44267: worker is 1 (out of 1 available) 15247 1726867242.44292: exiting _queue_task() for managed_node2/service 15247 1726867242.44308: done queuing things up, now waiting for results queue to drain 15247 1726867242.44309: waiting for pending results... 
15247 1726867242.44697: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15247 1726867242.44702: in run() - task 0affcac9-a3a5-8ce3-1923-00000000001f 15247 1726867242.44719: variable 'ansible_search_path' from source: unknown 15247 1726867242.44722: variable 'ansible_search_path' from source: unknown 15247 1726867242.44724: calling self._execute() 15247 1726867242.44767: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.44772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.44863: variable 'omit' from source: magic vars 15247 1726867242.45188: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.45209: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867242.45391: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867242.45522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867242.48093: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867242.48145: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867242.48171: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867242.48199: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867242.48219: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867242.48280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15247 1726867242.48302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.48321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.48347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.48361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.48395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.48414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.48431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.48456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.48469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.48499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.48518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.48534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.48561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.48575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.48680: variable 'network_connections' from source: play vars 15247 1726867242.48692: variable 'interface' from source: set_fact 15247 1726867242.48743: variable 'interface' from source: set_fact 15247 1726867242.48751: variable 'interface' from source: set_fact 15247 1726867242.48798: variable 'interface' from source: set_fact 15247 1726867242.48843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867242.48950: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867242.48976: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867242.49024: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867242.49044: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867242.49074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867242.49091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867242.49111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.49130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867242.49173: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867242.49320: variable 'network_connections' from source: play vars 15247 1726867242.49324: variable 'interface' from source: set_fact 15247 1726867242.49374: variable 'interface' from source: set_fact 15247 1726867242.49380: variable 'interface' from source: set_fact 15247 1726867242.49427: variable 'interface' from source: set_fact 15247 1726867242.49450: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867242.49454: when evaluation is False, skipping this task 15247 1726867242.49458: _execute() done 15247 1726867242.49467: dumping result to json 15247 1726867242.49469: done dumping result, returning 15247 1726867242.49507: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-00000000001f] 15247 1726867242.49518: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001f 15247 1726867242.49583: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000001f 15247 1726867242.49585: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15247 1726867242.49667: no more pending results, returning what we have 15247 1726867242.49670: results queue empty 15247 1726867242.49671: checking for any_errors_fatal 15247 1726867242.49676: done checking for any_errors_fatal 15247 1726867242.49679: checking for max_fail_percentage 15247 1726867242.49680: done checking for max_fail_percentage 15247 1726867242.49681: checking to see if all hosts have failed and the running result is not ok 15247 1726867242.49682: done checking to see if all hosts have failed 15247 1726867242.49682: getting the remaining hosts for this loop 15247 1726867242.49684: done getting the remaining hosts for this loop 15247 1726867242.49687: getting the next task for host managed_node2 15247 1726867242.49694: done getting next task for host managed_node2 15247 1726867242.49699: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15247 1726867242.49700: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867242.49712: getting variables 15247 1726867242.49714: in VariableManager get_vars() 15247 1726867242.49746: Calling all_inventory to load vars for managed_node2 15247 1726867242.49748: Calling groups_inventory to load vars for managed_node2 15247 1726867242.49750: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867242.49759: Calling all_plugins_play to load vars for managed_node2 15247 1726867242.49761: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867242.49763: Calling groups_plugins_play to load vars for managed_node2 15247 1726867242.51165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867242.52110: done with get_vars() 15247 1726867242.52134: done getting variables 15247 1726867242.52176: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:42 -0400 (0:00:00.082) 0:00:12.231 ****** 15247 1726867242.52202: entering _queue_task() for managed_node2/service 15247 1726867242.52456: worker is 1 (out of 1 available) 15247 1726867242.52474: exiting _queue_task() for managed_node2/service 15247 1726867242.52487: done queuing things up, now waiting for results queue to drain 15247 1726867242.52489: waiting for pending results... 
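The `skipping: [managed_node2]` result above is produced by the role's `when:` guard: the log shows `Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False`, so the restart task never runs its handler. A hypothetical minimal reconstruction of such a guarded task (the structure is assumed for illustration; the actual role source at `roles/network/tasks/main.yml:109` may differ):

```yaml
# Hypothetical sketch only -- not the role's actual source.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Matches the "false_condition" reported in the skip result above.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

When the condition evaluates false, Ansible short-circuits before `_execute()` reaches the action plugin, which is why the log jumps straight from the conditional evaluation to dumping the skip result.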
15247 1726867242.52688: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15247 1726867242.52755: in run() - task 0affcac9-a3a5-8ce3-1923-000000000020 15247 1726867242.52766: variable 'ansible_search_path' from source: unknown 15247 1726867242.52771: variable 'ansible_search_path' from source: unknown 15247 1726867242.52801: calling self._execute() 15247 1726867242.52867: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.52872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.52885: variable 'omit' from source: magic vars 15247 1726867242.53166: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.53175: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867242.53289: variable 'network_provider' from source: set_fact 15247 1726867242.53292: variable 'network_state' from source: role '' defaults 15247 1726867242.53317: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15247 1726867242.53320: variable 'omit' from source: magic vars 15247 1726867242.53341: variable 'omit' from source: magic vars 15247 1726867242.53362: variable 'network_service_name' from source: role '' defaults 15247 1726867242.53425: variable 'network_service_name' from source: role '' defaults 15247 1726867242.53499: variable '__network_provider_setup' from source: role '' defaults 15247 1726867242.53504: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867242.53551: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867242.53557: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867242.53616: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867242.53816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15247 1726867242.55910: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867242.55974: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867242.56011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867242.56036: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867242.56065: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867242.56119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.56139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.56159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.56190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.56200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.56233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15247 1726867242.56248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.56266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.56296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.56310: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.56484: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15247 1726867242.56559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.56576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.56598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.56626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.56636: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.56701: variable 'ansible_python' from source: facts 15247 1726867242.56716: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15247 1726867242.56770: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867242.56827: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867242.56909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.56930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.56948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.56972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.56984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.57017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867242.57040: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867242.57059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.57085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867242.57096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867242.57187: variable 'network_connections' from source: play vars 15247 1726867242.57193: variable 'interface' from source: set_fact 15247 1726867242.57245: variable 'interface' from source: set_fact 15247 1726867242.57254: variable 'interface' from source: set_fact 15247 1726867242.57309: variable 'interface' from source: set_fact 15247 1726867242.57381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867242.57519: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867242.57552: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867242.57585: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867242.57617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867242.57661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867242.57682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867242.57711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867242.57734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867242.57768: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867242.57946: variable 'network_connections' from source: play vars 15247 1726867242.57952: variable 'interface' from source: set_fact 15247 1726867242.58004: variable 'interface' from source: set_fact 15247 1726867242.58015: variable 'interface' from source: set_fact 15247 1726867242.58067: variable 'interface' from source: set_fact 15247 1726867242.58101: variable '__network_packages_default_wireless' from source: role '' defaults 15247 1726867242.58158: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867242.58382: variable 'network_connections' from source: play vars 15247 1726867242.58386: variable 'interface' from source: set_fact 15247 1726867242.58447: variable 'interface' from source: set_fact 15247 1726867242.58451: variable 'interface' from source: set_fact 15247 1726867242.58534: variable 'interface' from source: set_fact 15247 1726867242.58616: variable '__network_packages_default_team' from source: role '' defaults 15247 1726867242.58893: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867242.58928: variable 
'network_connections' from source: play vars 15247 1726867242.58931: variable 'interface' from source: set_fact 15247 1726867242.58996: variable 'interface' from source: set_fact 15247 1726867242.59003: variable 'interface' from source: set_fact 15247 1726867242.59071: variable 'interface' from source: set_fact 15247 1726867242.59133: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867242.59190: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867242.59196: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867242.59255: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867242.59462: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15247 1726867242.60131: variable 'network_connections' from source: play vars 15247 1726867242.60137: variable 'interface' from source: set_fact 15247 1726867242.60200: variable 'interface' from source: set_fact 15247 1726867242.60220: variable 'interface' from source: set_fact 15247 1726867242.60266: variable 'interface' from source: set_fact 15247 1726867242.60275: variable 'ansible_distribution' from source: facts 15247 1726867242.60280: variable '__network_rh_distros' from source: role '' defaults 15247 1726867242.60286: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.60360: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15247 1726867242.60522: variable 'ansible_distribution' from source: facts 15247 1726867242.60525: variable '__network_rh_distros' from source: role '' defaults 15247 1726867242.60528: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.60530: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15247 1726867242.60745: variable 'ansible_distribution' from source: 
facts 15247 1726867242.60748: variable '__network_rh_distros' from source: role '' defaults 15247 1726867242.60750: variable 'ansible_distribution_major_version' from source: facts 15247 1726867242.60752: variable 'network_provider' from source: set_fact 15247 1726867242.60754: variable 'omit' from source: magic vars 15247 1726867242.60853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867242.60857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867242.60860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867242.60862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867242.60869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867242.60946: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867242.60949: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.60951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.61020: Set connection var ansible_shell_executable to /bin/sh 15247 1726867242.61023: Set connection var ansible_connection to ssh 15247 1726867242.61025: Set connection var ansible_shell_type to sh 15247 1726867242.61029: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867242.61031: Set connection var ansible_timeout to 10 15247 1726867242.61032: Set connection var ansible_pipelining to False 15247 1726867242.61162: variable 'ansible_shell_executable' from source: unknown 15247 1726867242.61165: variable 'ansible_connection' from source: unknown 15247 1726867242.61167: variable 'ansible_module_compression' from source: unknown 15247 1726867242.61169: 
variable 'ansible_shell_type' from source: unknown 15247 1726867242.61171: variable 'ansible_shell_executable' from source: unknown 15247 1726867242.61173: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867242.61181: variable 'ansible_pipelining' from source: unknown 15247 1726867242.61183: variable 'ansible_timeout' from source: unknown 15247 1726867242.61185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867242.61188: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867242.61190: variable 'omit' from source: magic vars 15247 1726867242.61192: starting attempt loop 15247 1726867242.61194: running the handler 15247 1726867242.61313: variable 'ansible_facts' from source: unknown 15247 1726867242.61899: _low_level_execute_command(): starting 15247 1726867242.61907: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867242.62416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867242.62420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.62423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 15247 1726867242.62425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.62483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867242.62501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867242.62541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867242.64231: stdout chunk (state=3): >>>/root <<< 15247 1726867242.64375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867242.64380: stdout chunk (state=3): >>><<< 15247 1726867242.64389: stderr chunk (state=3): >>><<< 15247 1726867242.64402: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867242.64415: _low_level_execute_command(): starting 15247 1726867242.64422: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444 `" && echo ansible-tmp-1726867242.6440244-15924-146825310791444="` echo /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444 `" ) && sleep 0' 15247 1726867242.64846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867242.64849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867242.64852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.64854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867242.64859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.64973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867242.64976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867242.66856: stdout chunk (state=3): >>>ansible-tmp-1726867242.6440244-15924-146825310791444=/root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444 <<< 15247 1726867242.66965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867242.66993: stderr chunk (state=3): >>><<< 15247 1726867242.66997: stdout chunk (state=3): >>><<< 15247 1726867242.67002: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867242.6440244-15924-146825310791444=/root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867242.67051: variable 
'ansible_module_compression' from source: unknown 15247 1726867242.67137: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 15247 1726867242.67141: ANSIBALLZ: Acquiring lock 15247 1726867242.67143: ANSIBALLZ: Lock acquired: 140393880930304 15247 1726867242.67145: ANSIBALLZ: Creating module 15247 1726867242.89312: ANSIBALLZ: Writing module into payload 15247 1726867242.89423: ANSIBALLZ: Writing module 15247 1726867242.89444: ANSIBALLZ: Renaming module 15247 1726867242.89450: ANSIBALLZ: Done creating module 15247 1726867242.89485: variable 'ansible_facts' from source: unknown 15247 1726867242.89621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py 15247 1726867242.89725: Sending initial data 15247 1726867242.89729: Sent initial data (156 bytes) 15247 1726867242.90157: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867242.90192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867242.90195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867242.90197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.90200: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867242.90202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867242.90203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.90249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867242.90256: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867242.90317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867242.92062: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867242.92132: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867242.92192: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpt22j6h5m /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py <<< 15247 1726867242.92196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py" <<< 15247 1726867242.92264: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpt22j6h5m" to remote "/root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py" <<< 15247 1726867242.93994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867242.94109: stdout chunk (state=3): >>><<< 15247 1726867242.94113: stderr chunk (state=3): >>><<< 15247 1726867242.94115: done transferring module to remote 15247 1726867242.94118: _low_level_execute_command(): starting 15247 1726867242.94120: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/ /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py && sleep 0' 15247 1726867242.94580: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867242.94593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.94607: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.94650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867242.94671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867242.94708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867242.96852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867242.96855: stdout chunk (state=3): >>><<< 15247 1726867242.96858: stderr chunk (state=3): >>><<< 15247 1726867242.96860: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867242.96862: _low_level_execute_command(): starting 15247 1726867242.96864: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/AnsiballZ_systemd.py && sleep 0' 15247 1726867242.97725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867242.97733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867242.97817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867242.97846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867242.97908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15247 1726867243.27369: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; 
stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4460544", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309223936", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "712765000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", 
"PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": 
"network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-<<< 15247 1726867243.27383: stdout chunk (state=3): >>>broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", 
"AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15247 1726867243.29230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867243.29234: stdout chunk (state=3): >>><<< 15247 1726867243.29238: stderr chunk (state=3): >>><<< 15247 1726867243.29257: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 
EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4460544", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3309223936", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "712765000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", 
"IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", 
"InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867243.29376: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867243.29396: _low_level_execute_command(): starting 15247 1726867243.29399: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867242.6440244-15924-146825310791444/ > /dev/null 2>&1 && sleep 0' 15247 1726867243.29998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.30004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867243.30009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.30012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867243.30051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867243.30127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867243.32082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867243.32085: stdout chunk (state=3): >>><<< 15247 1726867243.32088: stderr chunk (state=3): >>><<< 15247 1726867243.32090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867243.32094: handler run complete 15247 1726867243.32121: attempt loop complete, returning result 15247 1726867243.32135: _execute() done 15247 1726867243.32144: dumping result to json 15247 1726867243.32168: done dumping result, returning 15247 1726867243.32188: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-8ce3-1923-000000000020] 15247 1726867243.32208: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000020 15247 1726867243.32548: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000020 15247 1726867243.32555: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867243.32637: no more pending results, returning what we have 15247 1726867243.32640: results queue empty 15247 1726867243.32641: checking for any_errors_fatal 15247 1726867243.32646: done checking for any_errors_fatal 15247 1726867243.32647: checking for max_fail_percentage 15247 1726867243.32648: done checking for max_fail_percentage 15247 
1726867243.32649: checking to see if all hosts have failed and the running result is not ok 15247 1726867243.32650: done checking to see if all hosts have failed 15247 1726867243.32651: getting the remaining hosts for this loop 15247 1726867243.32652: done getting the remaining hosts for this loop 15247 1726867243.32655: getting the next task for host managed_node2 15247 1726867243.32660: done getting next task for host managed_node2 15247 1726867243.32663: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15247 1726867243.32664: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867243.32673: getting variables 15247 1726867243.32674: in VariableManager get_vars() 15247 1726867243.32707: Calling all_inventory to load vars for managed_node2 15247 1726867243.32709: Calling groups_inventory to load vars for managed_node2 15247 1726867243.32711: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867243.32720: Calling all_plugins_play to load vars for managed_node2 15247 1726867243.32722: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867243.32724: Calling groups_plugins_play to load vars for managed_node2 15247 1726867243.34161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867243.35701: done with get_vars() 15247 1726867243.35721: done getting variables 15247 1726867243.35781: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:43 -0400 (0:00:00.836) 0:00:13.067 ****** 15247 1726867243.35810: entering _queue_task() for managed_node2/service 15247 1726867243.36107: worker is 1 (out of 1 available) 15247 1726867243.36121: exiting _queue_task() for managed_node2/service 15247 1726867243.36132: done queuing things up, now waiting for results queue to drain 15247 1726867243.36134: waiting for pending results... 15247 1726867243.36702: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15247 1726867243.36716: in run() - task 0affcac9-a3a5-8ce3-1923-000000000021 15247 1726867243.36719: variable 'ansible_search_path' from source: unknown 15247 1726867243.36722: variable 'ansible_search_path' from source: unknown 15247 1726867243.36724: calling self._execute() 15247 1726867243.36727: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867243.36730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867243.36732: variable 'omit' from source: magic vars 15247 1726867243.37126: variable 'ansible_distribution_major_version' from source: facts 15247 1726867243.37130: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867243.37167: variable 'network_provider' from source: set_fact 15247 1726867243.37181: Evaluated conditional (network_provider == "nm"): True 15247 1726867243.37284: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867243.37394: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867243.37601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867243.39719: Loading 
FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867243.39784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867243.39808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867243.39937: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867243.39940: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867243.39958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867243.39982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867243.40015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867243.40042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867243.40054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867243.40089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 
1726867243.40108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867243.40125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867243.40165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867243.40175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867243.40207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867243.40223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867243.40243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867243.40267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867243.40279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 15247 1726867243.40373: variable 'network_connections' from source: play vars 15247 1726867243.40384: variable 'interface' from source: set_fact 15247 1726867243.40434: variable 'interface' from source: set_fact 15247 1726867243.40442: variable 'interface' from source: set_fact 15247 1726867243.40487: variable 'interface' from source: set_fact 15247 1726867243.40536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867243.40646: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867243.40686: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867243.40730: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867243.40748: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867243.40798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867243.40826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867243.40847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867243.40880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867243.40929: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 
1726867243.41282: variable 'network_connections' from source: play vars 15247 1726867243.41285: variable 'interface' from source: set_fact 15247 1726867243.41287: variable 'interface' from source: set_fact 15247 1726867243.41289: variable 'interface' from source: set_fact 15247 1726867243.41291: variable 'interface' from source: set_fact 15247 1726867243.41330: Evaluated conditional (__network_wpa_supplicant_required): False 15247 1726867243.41338: when evaluation is False, skipping this task 15247 1726867243.41346: _execute() done 15247 1726867243.41361: dumping result to json 15247 1726867243.41369: done dumping result, returning 15247 1726867243.41390: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-8ce3-1923-000000000021] 15247 1726867243.41406: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000021 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15247 1726867243.41559: no more pending results, returning what we have 15247 1726867243.41562: results queue empty 15247 1726867243.41563: checking for any_errors_fatal 15247 1726867243.41610: done checking for any_errors_fatal 15247 1726867243.41611: checking for max_fail_percentage 15247 1726867243.41614: done checking for max_fail_percentage 15247 1726867243.41614: checking to see if all hosts have failed and the running result is not ok 15247 1726867243.41615: done checking to see if all hosts have failed 15247 1726867243.41616: getting the remaining hosts for this loop 15247 1726867243.41623: done getting the remaining hosts for this loop 15247 1726867243.41627: getting the next task for host managed_node2 15247 1726867243.41633: done getting next task for host managed_node2 15247 1726867243.41636: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15247 1726867243.41638: ^ state is: HOST 
STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867243.41648: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000021 15247 1726867243.41680: WORKER PROCESS EXITING 15247 1726867243.41690: getting variables 15247 1726867243.41691: in VariableManager get_vars() 15247 1726867243.41735: Calling all_inventory to load vars for managed_node2 15247 1726867243.41738: Calling groups_inventory to load vars for managed_node2 15247 1726867243.41740: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867243.41749: Calling all_plugins_play to load vars for managed_node2 15247 1726867243.41751: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867243.41753: Calling groups_plugins_play to load vars for managed_node2 15247 1726867243.42928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867243.43973: done with get_vars() 15247 1726867243.43990: done getting variables 15247 1726867243.44054: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:43 -0400 (0:00:00.082) 0:00:13.150 ****** 15247 1726867243.44088: entering _queue_task() for managed_node2/service 15247 1726867243.44356: worker is 1 (out of 1 available) 15247 1726867243.44368: exiting 
_queue_task() for managed_node2/service 15247 1726867243.44662: done queuing things up, now waiting for results queue to drain 15247 1726867243.44664: waiting for pending results... 15247 1726867243.44713: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 15247 1726867243.44885: in run() - task 0affcac9-a3a5-8ce3-1923-000000000022 15247 1726867243.44890: variable 'ansible_search_path' from source: unknown 15247 1726867243.44893: variable 'ansible_search_path' from source: unknown 15247 1726867243.44907: calling self._execute() 15247 1726867243.44998: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867243.45009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867243.45084: variable 'omit' from source: magic vars 15247 1726867243.45394: variable 'ansible_distribution_major_version' from source: facts 15247 1726867243.45411: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867243.45520: variable 'network_provider' from source: set_fact 15247 1726867243.45533: Evaluated conditional (network_provider == "initscripts"): False 15247 1726867243.45545: when evaluation is False, skipping this task 15247 1726867243.45553: _execute() done 15247 1726867243.45560: dumping result to json 15247 1726867243.45567: done dumping result, returning 15247 1726867243.45580: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-8ce3-1923-000000000022] 15247 1726867243.45594: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000022 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867243.45851: no more pending results, returning what we have 15247 1726867243.45854: results queue empty 15247 1726867243.45854: checking for any_errors_fatal 
15247 1726867243.45860: done checking for any_errors_fatal 15247 1726867243.45860: checking for max_fail_percentage 15247 1726867243.45862: done checking for max_fail_percentage 15247 1726867243.45862: checking to see if all hosts have failed and the running result is not ok 15247 1726867243.45863: done checking to see if all hosts have failed 15247 1726867243.45864: getting the remaining hosts for this loop 15247 1726867243.45865: done getting the remaining hosts for this loop 15247 1726867243.45867: getting the next task for host managed_node2 15247 1726867243.45872: done getting next task for host managed_node2 15247 1726867243.45875: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15247 1726867243.45879: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867243.45890: getting variables 15247 1726867243.45891: in VariableManager get_vars() 15247 1726867243.45921: Calling all_inventory to load vars for managed_node2 15247 1726867243.45924: Calling groups_inventory to load vars for managed_node2 15247 1726867243.45926: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867243.45934: Calling all_plugins_play to load vars for managed_node2 15247 1726867243.45937: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867243.45940: Calling groups_plugins_play to load vars for managed_node2 15247 1726867243.46490: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000022 15247 1726867243.46494: WORKER PROCESS EXITING 15247 1726867243.47299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867243.48814: done with get_vars() 15247 1726867243.48838: done getting variables 15247 1726867243.48898: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:43 -0400 (0:00:00.048) 0:00:13.198 ****** 15247 1726867243.48931: entering _queue_task() for managed_node2/copy 15247 1726867243.49216: worker is 1 (out of 1 available) 15247 1726867243.49229: exiting _queue_task() for managed_node2/copy 15247 1726867243.49242: done queuing things up, now waiting for results queue to drain 15247 1726867243.49243: waiting for pending results... 
15247 1726867243.49540: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15247 1726867243.49647: in run() - task 0affcac9-a3a5-8ce3-1923-000000000023 15247 1726867243.49674: variable 'ansible_search_path' from source: unknown 15247 1726867243.49689: variable 'ansible_search_path' from source: unknown 15247 1726867243.49743: calling self._execute() 15247 1726867243.49843: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867243.49856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867243.49873: variable 'omit' from source: magic vars 15247 1726867243.50274: variable 'ansible_distribution_major_version' from source: facts 15247 1726867243.50293: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867243.50414: variable 'network_provider' from source: set_fact 15247 1726867243.50427: Evaluated conditional (network_provider == "initscripts"): False 15247 1726867243.50456: when evaluation is False, skipping this task 15247 1726867243.50459: _execute() done 15247 1726867243.50461: dumping result to json 15247 1726867243.50463: done dumping result, returning 15247 1726867243.50466: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-8ce3-1923-000000000023] 15247 1726867243.50682: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000023 15247 1726867243.50745: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000023 15247 1726867243.50748: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15247 1726867243.50784: no more pending results, returning what we have 15247 1726867243.50788: results queue empty 15247 1726867243.50789: checking for 
any_errors_fatal 15247 1726867243.50794: done checking for any_errors_fatal 15247 1726867243.50795: checking for max_fail_percentage 15247 1726867243.50796: done checking for max_fail_percentage 15247 1726867243.50797: checking to see if all hosts have failed and the running result is not ok 15247 1726867243.50798: done checking to see if all hosts have failed 15247 1726867243.50798: getting the remaining hosts for this loop 15247 1726867243.50800: done getting the remaining hosts for this loop 15247 1726867243.50803: getting the next task for host managed_node2 15247 1726867243.50807: done getting next task for host managed_node2 15247 1726867243.50811: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15247 1726867243.50813: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867243.50824: getting variables 15247 1726867243.50826: in VariableManager get_vars() 15247 1726867243.50856: Calling all_inventory to load vars for managed_node2 15247 1726867243.50858: Calling groups_inventory to load vars for managed_node2 15247 1726867243.50860: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867243.50869: Calling all_plugins_play to load vars for managed_node2 15247 1726867243.50871: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867243.50874: Calling groups_plugins_play to load vars for managed_node2 15247 1726867243.52758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867243.54408: done with get_vars() 15247 1726867243.54429: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:43 -0400 (0:00:00.055) 0:00:13.254 ****** 15247 1726867243.54512: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15247 1726867243.54514: Creating lock for fedora.linux_system_roles.network_connections 15247 1726867243.54814: worker is 1 (out of 1 available) 15247 1726867243.54826: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15247 1726867243.54838: done queuing things up, now waiting for results queue to drain 15247 1726867243.54839: waiting for pending results... 
15247 1726867243.55099: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15247 1726867243.55209: in run() - task 0affcac9-a3a5-8ce3-1923-000000000024 15247 1726867243.55230: variable 'ansible_search_path' from source: unknown 15247 1726867243.55237: variable 'ansible_search_path' from source: unknown 15247 1726867243.55276: calling self._execute() 15247 1726867243.55370: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867243.55385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867243.55400: variable 'omit' from source: magic vars 15247 1726867243.55760: variable 'ansible_distribution_major_version' from source: facts 15247 1726867243.55874: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867243.55880: variable 'omit' from source: magic vars 15247 1726867243.55912: variable 'omit' from source: magic vars 15247 1726867243.56067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867243.58285: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867243.58288: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867243.58305: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867243.58345: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867243.58376: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867243.58543: variable 'network_provider' from source: set_fact 15247 1726867243.58817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867243.58865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867243.59083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867243.59087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867243.59092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867243.59245: variable 'omit' from source: magic vars 15247 1726867243.59402: variable 'omit' from source: magic vars 15247 1726867243.59513: variable 'network_connections' from source: play vars 15247 1726867243.59531: variable 'interface' from source: set_fact 15247 1726867243.59612: variable 'interface' from source: set_fact 15247 1726867243.59626: variable 'interface' from source: set_fact 15247 1726867243.59686: variable 'interface' from source: set_fact 15247 1726867243.59794: variable 'omit' from source: magic vars 15247 1726867243.59801: variable '__lsr_ansible_managed' from source: task vars 15247 1726867243.59842: variable '__lsr_ansible_managed' from source: task vars 15247 1726867243.59974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15247 1726867243.60118: Loaded config def from plugin (lookup/template) 15247 1726867243.60121: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15247 1726867243.60148: File lookup term: get_ansible_managed.j2 15247 1726867243.60151: variable 'ansible_search_path' from source: unknown 15247 1726867243.60154: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15247 1726867243.60162: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15247 1726867243.60175: variable 'ansible_search_path' from source: unknown 15247 1726867243.64194: variable 'ansible_managed' from source: unknown 15247 1726867243.64265: variable 'omit' from source: magic vars 15247 1726867243.64301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867243.64314: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867243.64328: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867243.64341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867243.64348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867243.64382: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867243.64385: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867243.64387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867243.64698: Set connection var ansible_shell_executable to /bin/sh 15247 1726867243.64701: Set connection var ansible_connection to ssh 15247 1726867243.64703: Set connection var ansible_shell_type to sh 15247 1726867243.64707: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867243.64710: Set connection var ansible_timeout to 10 15247 1726867243.64712: Set connection var ansible_pipelining to False 15247 1726867243.64714: variable 'ansible_shell_executable' from source: unknown 15247 1726867243.64716: variable 'ansible_connection' from source: unknown 15247 1726867243.64718: variable 'ansible_module_compression' from source: unknown 15247 1726867243.64720: variable 'ansible_shell_type' from source: unknown 15247 1726867243.64722: variable 'ansible_shell_executable' from source: unknown 15247 1726867243.64724: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867243.64726: variable 'ansible_pipelining' from source: unknown 15247 1726867243.64728: variable 'ansible_timeout' from source: unknown 15247 1726867243.64730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867243.64732: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867243.64740: variable 'omit' from source: magic vars 15247 1726867243.64742: starting attempt loop 15247 1726867243.64745: running the handler 15247 1726867243.64747: _low_level_execute_command(): starting 15247 1726867243.64749: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867243.65304: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867243.65310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867243.65321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867243.65334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867243.65345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867243.65382: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867243.65385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.65387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867243.65390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867243.65392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15247 1726867243.65394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867243.65402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867243.65413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867243.65422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 <<< 15247 1726867243.65429: stderr chunk (state=3): >>>debug2: match found <<< 15247 1726867243.65438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.65509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867243.65520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867243.65536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867243.65621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867243.67316: stdout chunk (state=3): >>>/root <<< 15247 1726867243.67432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867243.67435: stdout chunk (state=3): >>><<< 15247 1726867243.67463: stderr chunk (state=3): >>><<< 15247 1726867243.67466: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867243.67480: _low_level_execute_command(): starting 15247 1726867243.67488: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666 `" && echo ansible-tmp-1726867243.6746662-15973-234277255531666="` echo /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666 `" ) && sleep 0' 15247 1726867243.68033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867243.68039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867243.68042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867243.68059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867243.68069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867243.68110: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867243.68113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.68115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867243.68118: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867243.68149: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15247 1726867243.68152: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867243.68155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867243.68157: stderr chunk (state=3): >>>debug2: match found <<< 15247 1726867243.68181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.68226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867243.68291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867243.70175: stdout chunk (state=3): >>>ansible-tmp-1726867243.6746662-15973-234277255531666=/root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666 <<< 15247 1726867243.70280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867243.70482: stderr chunk (state=3): >>><<< 15247 1726867243.70486: stdout chunk (state=3): >>><<< 15247 1726867243.70488: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867243.6746662-15973-234277255531666=/root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867243.70496: variable 'ansible_module_compression' from source: unknown 15247 1726867243.70498: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 15247 1726867243.70500: ANSIBALLZ: Acquiring lock 15247 1726867243.70502: ANSIBALLZ: Lock acquired: 140393872432336 15247 1726867243.70504: ANSIBALLZ: Creating module 15247 1726867243.86229: ANSIBALLZ: Writing module into payload 15247 1726867243.86453: ANSIBALLZ: Writing module 15247 1726867243.86472: ANSIBALLZ: Renaming module 15247 1726867243.86479: ANSIBALLZ: Done creating module 15247 1726867243.86499: variable 'ansible_facts' from source: unknown 15247 1726867243.86561: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py 15247 1726867243.86666: Sending initial data 15247 1726867243.86669: Sent initial data (168 bytes) 15247 1726867243.87151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867243.87155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867243.87160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867243.87184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.87215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867243.87218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867243.87293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867243.88937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867243.88975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867243.89024: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp5rg8ghu2 /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py <<< 15247 1726867243.89027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py" <<< 15247 1726867243.89073: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp5rg8ghu2" to remote "/root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py" <<< 15247 1726867243.89084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py" <<< 15247 1726867243.89988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867243.90084: stderr chunk (state=3): >>><<< 15247 1726867243.90087: stdout chunk (state=3): >>><<< 15247 1726867243.90089: done transferring module to remote 15247 1726867243.90092: _low_level_execute_command(): starting 15247 1726867243.90094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/ /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py && sleep 0' 15247 1726867243.90464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867243.90521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867243.90524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match not found <<< 15247 1726867243.90526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.90529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867243.90560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867243.90567: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867243.90620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867243.92376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867243.92400: stderr chunk (state=3): >>><<< 15247 1726867243.92403: stdout chunk (state=3): >>><<< 15247 1726867243.92417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867243.92420: _low_level_execute_command(): starting 15247 1726867243.92424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/AnsiballZ_network_connections.py && sleep 0' 15247 1726867243.92962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867243.92966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867243.92968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867243.92970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 
1726867243.93075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867243.93148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867243.93218: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867244.24239: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 (is-modified)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15247 1726867244.26159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867244.26163: stdout chunk (state=3): >>><<< 15247 1726867244.26165: stderr chunk (state=3): >>><<< 15247 1726867244.26195: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 (is-modified)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867244.26358: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867244.26361: _low_level_execute_command(): starting 15247 1726867244.26364: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867243.6746662-15973-234277255531666/ > /dev/null 2>&1 && sleep 0' 15247 1726867244.27618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867244.27702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867244.27880: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867244.27890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867244.28014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867244.28165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867244.30188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867244.30220: stderr chunk (state=3): >>><<< 15247 1726867244.30229: stdout chunk (state=3): >>><<< 15247 1726867244.30254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867244.30272: handler run complete 15247 1726867244.30314: attempt loop complete, returning result 15247 1726867244.30322: _execute() done 15247 1726867244.30330: dumping result to json 15247 1726867244.30343: done dumping result, returning 15247 1726867244.30359: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-8ce3-1923-000000000024] 15247 1726867244.30372: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000024 changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 (is-modified) 15247 1726867244.30619: no more pending results, returning what we have 15247 1726867244.30622: results queue empty 15247 1726867244.30623: checking for 
any_errors_fatal 15247 1726867244.30632: done checking for any_errors_fatal 15247 1726867244.30633: checking for max_fail_percentage 15247 1726867244.30635: done checking for max_fail_percentage 15247 1726867244.30635: checking to see if all hosts have failed and the running result is not ok 15247 1726867244.30636: done checking to see if all hosts have failed 15247 1726867244.30637: getting the remaining hosts for this loop 15247 1726867244.30638: done getting the remaining hosts for this loop 15247 1726867244.30642: getting the next task for host managed_node2 15247 1726867244.30647: done getting next task for host managed_node2 15247 1726867244.30650: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15247 1726867244.30652: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867244.30660: getting variables 15247 1726867244.30661: in VariableManager get_vars() 15247 1726867244.30739: Calling all_inventory to load vars for managed_node2 15247 1726867244.30742: Calling groups_inventory to load vars for managed_node2 15247 1726867244.30744: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867244.30756: Calling all_plugins_play to load vars for managed_node2 15247 1726867244.30758: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867244.30762: Calling groups_plugins_play to load vars for managed_node2 15247 1726867244.32319: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000024 15247 1726867244.32322: WORKER PROCESS EXITING 15247 1726867244.33157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867244.35417: done with get_vars() 15247 1726867244.35453: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:44 -0400 (0:00:00.810) 0:00:14.064 ****** 15247 1726867244.35533: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15247 1726867244.35535: Creating lock for fedora.linux_system_roles.network_state 15247 1726867244.36005: worker is 1 (out of 1 available) 15247 1726867244.36015: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15247 1726867244.36024: done queuing things up, now waiting for results queue to drain 15247 1726867244.36025: waiting for pending results... 
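The `module_args` echoed in the task result above (`changed: [managed_node2]` for "Configure networking connection profiles") correspond to a role invocation along these lines. This is a minimal sketch reconstructed from the logged arguments, not the actual test playbook (which is not shown in this log); the play structure and variable placement are assumptions, while the `network_connections` values are copied from the result:

```yaml
# Hedged sketch: a play that would produce the module_args logged above.
# Only the network_connections values are taken from the log; the rest is assumed.
- hosts: managed_node2
  roles:
    - fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: LSR-TST-br31
        interface_name: LSR-TST-br31
        type: bridge
        state: up
        ip:
          dhcp4: false
          auto6: true
```

The role translates this variable into the `fedora.linux_system_roles.network_connections` module call seen in the log (provider `nm`, with the `# Ansible managed` header injected via `__header`); the subsequent "Configure networking state" task is then skipped because `network_state` is empty.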
15247 1726867244.36392: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 15247 1726867244.36784: in run() - task 0affcac9-a3a5-8ce3-1923-000000000025 15247 1726867244.37294: variable 'ansible_search_path' from source: unknown 15247 1726867244.37297: variable 'ansible_search_path' from source: unknown 15247 1726867244.37300: calling self._execute() 15247 1726867244.37417: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.37496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.37592: variable 'omit' from source: magic vars 15247 1726867244.38955: variable 'ansible_distribution_major_version' from source: facts 15247 1726867244.39183: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867244.39261: variable 'network_state' from source: role '' defaults 15247 1726867244.39387: Evaluated conditional (network_state != {}): False 15247 1726867244.39395: when evaluation is False, skipping this task 15247 1726867244.39402: _execute() done 15247 1726867244.39433: dumping result to json 15247 1726867244.39440: done dumping result, returning 15247 1726867244.39451: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-8ce3-1923-000000000025] 15247 1726867244.39461: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000025 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867244.39696: no more pending results, returning what we have 15247 1726867244.39700: results queue empty 15247 1726867244.39701: checking for any_errors_fatal 15247 1726867244.39712: done checking for any_errors_fatal 15247 1726867244.39712: checking for max_fail_percentage 15247 1726867244.39714: done checking for max_fail_percentage 15247 1726867244.39715: 
checking to see if all hosts have failed and the running result is not ok 15247 1726867244.39716: done checking to see if all hosts have failed 15247 1726867244.39717: getting the remaining hosts for this loop 15247 1726867244.39718: done getting the remaining hosts for this loop 15247 1726867244.39722: getting the next task for host managed_node2 15247 1726867244.39729: done getting next task for host managed_node2 15247 1726867244.39732: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15247 1726867244.39735: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867244.39753: getting variables 15247 1726867244.39755: in VariableManager get_vars() 15247 1726867244.39796: Calling all_inventory to load vars for managed_node2 15247 1726867244.39799: Calling groups_inventory to load vars for managed_node2 15247 1726867244.39802: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867244.39814: Calling all_plugins_play to load vars for managed_node2 15247 1726867244.39817: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867244.39820: Calling groups_plugins_play to load vars for managed_node2 15247 1726867244.41184: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000025 15247 1726867244.41188: WORKER PROCESS EXITING 15247 1726867244.44325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867244.45995: done with get_vars() 15247 1726867244.46018: done getting variables 15247 1726867244.46076: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:44 -0400 (0:00:00.105) 0:00:14.170 ****** 15247 1726867244.46110: entering _queue_task() for managed_node2/debug 15247 1726867244.46438: worker is 1 (out of 1 available) 15247 1726867244.46452: exiting _queue_task() for managed_node2/debug 15247 1726867244.46471: done queuing things up, now waiting for results queue to drain 15247 1726867244.46473: waiting for pending results... 15247 1726867244.46898: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15247 1726867244.47004: in run() - task 0affcac9-a3a5-8ce3-1923-000000000026 15247 1726867244.47042: variable 'ansible_search_path' from source: unknown 15247 1726867244.47050: variable 'ansible_search_path' from source: unknown 15247 1726867244.47092: calling self._execute() 15247 1726867244.47188: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.47201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.47215: variable 'omit' from source: magic vars 15247 1726867244.47587: variable 'ansible_distribution_major_version' from source: facts 15247 1726867244.47602: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867244.47611: variable 'omit' from source: magic vars 15247 1726867244.47650: variable 'omit' from source: magic vars 15247 1726867244.47696: variable 'omit' from source: magic vars 15247 1726867244.47738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867244.47783: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867244.47809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867244.47832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867244.47904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867244.47938: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867244.47973: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.47983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.48090: Set connection var ansible_shell_executable to /bin/sh 15247 1726867244.48142: Set connection var ansible_connection to ssh 15247 1726867244.48146: Set connection var ansible_shell_type to sh 15247 1726867244.48148: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867244.48150: Set connection var ansible_timeout to 10 15247 1726867244.48155: Set connection var ansible_pipelining to False 15247 1726867244.48316: variable 'ansible_shell_executable' from source: unknown 15247 1726867244.48319: variable 'ansible_connection' from source: unknown 15247 1726867244.48322: variable 'ansible_module_compression' from source: unknown 15247 1726867244.48324: variable 'ansible_shell_type' from source: unknown 15247 1726867244.48325: variable 'ansible_shell_executable' from source: unknown 15247 1726867244.48327: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.48329: variable 'ansible_pipelining' from source: unknown 15247 1726867244.48331: variable 'ansible_timeout' from source: unknown 15247 1726867244.48333: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867244.48390: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867244.48407: variable 'omit' from source: magic vars 15247 1726867244.48417: starting attempt loop 15247 1726867244.48435: running the handler 15247 1726867244.48564: variable '__network_connections_result' from source: set_fact 15247 1726867244.48621: handler run complete 15247 1726867244.48648: attempt loop complete, returning result 15247 1726867244.48655: _execute() done 15247 1726867244.48683: dumping result to json 15247 1726867244.48686: done dumping result, returning 15247 1726867244.48689: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-8ce3-1923-000000000026] 15247 1726867244.48693: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000026 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 (is-modified)" ] } 15247 1726867244.48854: no more pending results, returning what we have 15247 1726867244.48857: results queue empty 15247 1726867244.48859: checking for any_errors_fatal 15247 1726867244.48867: done checking for any_errors_fatal 15247 1726867244.48868: checking for max_fail_percentage 15247 1726867244.48870: done checking for max_fail_percentage 15247 1726867244.48871: checking to see if all hosts have failed and the running result is not ok 15247 1726867244.48872: done checking to see if all hosts have failed 15247 
1726867244.48873: getting the remaining hosts for this loop 15247 1726867244.48874: done getting the remaining hosts for this loop 15247 1726867244.48880: getting the next task for host managed_node2 15247 1726867244.48887: done getting next task for host managed_node2 15247 1726867244.48890: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15247 1726867244.48893: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867244.48902: getting variables 15247 1726867244.48904: in VariableManager get_vars() 15247 1726867244.48939: Calling all_inventory to load vars for managed_node2 15247 1726867244.48942: Calling groups_inventory to load vars for managed_node2 15247 1726867244.48944: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867244.48956: Calling all_plugins_play to load vars for managed_node2 15247 1726867244.48959: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867244.48962: Calling groups_plugins_play to load vars for managed_node2 15247 1726867244.49652: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000026 15247 1726867244.49656: WORKER PROCESS EXITING 15247 1726867244.51617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867244.56807: done with get_vars() 15247 1726867244.56836: done getting variables 15247 1726867244.56915: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:44 -0400 (0:00:00.108) 0:00:14.279 ****** 15247 1726867244.56947: entering _queue_task() for managed_node2/debug 15247 1726867244.57379: worker is 1 (out of 1 available) 15247 1726867244.57392: exiting _queue_task() for managed_node2/debug 15247 1726867244.57404: done queuing things up, now waiting for results queue to drain 15247 1726867244.57408: waiting for pending results... 15247 1726867244.57684: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15247 1726867244.57797: in run() - task 0affcac9-a3a5-8ce3-1923-000000000027 15247 1726867244.57824: variable 'ansible_search_path' from source: unknown 15247 1726867244.57833: variable 'ansible_search_path' from source: unknown 15247 1726867244.57874: calling self._execute() 15247 1726867244.57975: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.57990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.58022: variable 'omit' from source: magic vars 15247 1726867244.58400: variable 'ansible_distribution_major_version' from source: facts 15247 1726867244.58419: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867244.58457: variable 'omit' from source: magic vars 15247 1726867244.58476: variable 'omit' from source: magic vars 15247 1726867244.58520: variable 'omit' from source: magic vars 15247 1726867244.58568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867244.58676: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867244.58681: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867244.58683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867244.58685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867244.58707: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867244.58716: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.58723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.58832: Set connection var ansible_shell_executable to /bin/sh 15247 1726867244.58841: Set connection var ansible_connection to ssh 15247 1726867244.58849: Set connection var ansible_shell_type to sh 15247 1726867244.58866: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867244.58896: Set connection var ansible_timeout to 10 15247 1726867244.58914: Set connection var ansible_pipelining to False 15247 1726867244.58940: variable 'ansible_shell_executable' from source: unknown 15247 1726867244.59014: variable 'ansible_connection' from source: unknown 15247 1726867244.59017: variable 'ansible_module_compression' from source: unknown 15247 1726867244.59020: variable 'ansible_shell_type' from source: unknown 15247 1726867244.59022: variable 'ansible_shell_executable' from source: unknown 15247 1726867244.59034: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.59036: variable 'ansible_pipelining' from source: unknown 15247 1726867244.59038: variable 'ansible_timeout' from source: unknown 15247 1726867244.59040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.59160: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867244.59180: variable 'omit' from source: magic vars 15247 1726867244.59191: starting attempt loop 15247 1726867244.59197: running the handler 15247 1726867244.59260: variable '__network_connections_result' from source: set_fact 15247 1726867244.59351: variable '__network_connections_result' from source: set_fact 15247 1726867244.59475: handler run complete 15247 1726867244.59515: attempt loop complete, returning result 15247 1726867244.59558: _execute() done 15247 1726867244.59561: dumping result to json 15247 1726867244.59564: done dumping result, returning 15247 1726867244.59566: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-8ce3-1923-000000000027] 15247 1726867244.59569: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000027 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 (is-modified)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection 
LSR-TST-br31, 0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65 (is-modified)" ] } } 15247 1726867244.59866: no more pending results, returning what we have 15247 1726867244.59870: results queue empty 15247 1726867244.59871: checking for any_errors_fatal 15247 1726867244.59881: done checking for any_errors_fatal 15247 1726867244.59882: checking for max_fail_percentage 15247 1726867244.59884: done checking for max_fail_percentage 15247 1726867244.59885: checking to see if all hosts have failed and the running result is not ok 15247 1726867244.59886: done checking to see if all hosts have failed 15247 1726867244.59887: getting the remaining hosts for this loop 15247 1726867244.59888: done getting the remaining hosts for this loop 15247 1726867244.59893: getting the next task for host managed_node2 15247 1726867244.59900: done getting next task for host managed_node2 15247 1726867244.59903: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15247 1726867244.59908: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867244.59918: getting variables 15247 1726867244.59920: in VariableManager get_vars() 15247 1726867244.59954: Calling all_inventory to load vars for managed_node2 15247 1726867244.59957: Calling groups_inventory to load vars for managed_node2 15247 1726867244.59959: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867244.59970: Calling all_plugins_play to load vars for managed_node2 15247 1726867244.59973: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867244.59975: Calling groups_plugins_play to load vars for managed_node2 15247 1726867244.60195: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000027 15247 1726867244.60199: WORKER PROCESS EXITING 15247 1726867244.61737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867244.63359: done with get_vars() 15247 1726867244.63400: done getting variables 15247 1726867244.63476: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:44 -0400 (0:00:00.065) 0:00:14.344 ****** 15247 1726867244.63516: entering _queue_task() for managed_node2/debug 15247 1726867244.63929: worker is 1 (out of 1 available) 15247 1726867244.63940: exiting _queue_task() for managed_node2/debug 15247 1726867244.63951: done queuing things up, now waiting for results queue to drain 15247 1726867244.63953: waiting for pending results... 
15247 1726867244.64347: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15247 1726867244.64536: in run() - task 0affcac9-a3a5-8ce3-1923-000000000028 15247 1726867244.64569: variable 'ansible_search_path' from source: unknown 15247 1726867244.64685: variable 'ansible_search_path' from source: unknown 15247 1726867244.64688: calling self._execute() 15247 1726867244.64747: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.64775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.64813: variable 'omit' from source: magic vars 15247 1726867244.65394: variable 'ansible_distribution_major_version' from source: facts 15247 1726867244.65457: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867244.65614: variable 'network_state' from source: role '' defaults 15247 1726867244.65638: Evaluated conditional (network_state != {}): False 15247 1726867244.65692: when evaluation is False, skipping this task 15247 1726867244.65695: _execute() done 15247 1726867244.65697: dumping result to json 15247 1726867244.65699: done dumping result, returning 15247 1726867244.65701: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-8ce3-1923-000000000028] 15247 1726867244.65707: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000028 15247 1726867244.65944: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000028 15247 1726867244.65947: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 15247 1726867244.66013: no more pending results, returning what we have 15247 1726867244.66018: results queue empty 15247 1726867244.66019: checking for any_errors_fatal 15247 1726867244.66028: done checking for any_errors_fatal 15247 1726867244.66029: checking for 
max_fail_percentage 15247 1726867244.66031: done checking for max_fail_percentage 15247 1726867244.66032: checking to see if all hosts have failed and the running result is not ok 15247 1726867244.66033: done checking to see if all hosts have failed 15247 1726867244.66034: getting the remaining hosts for this loop 15247 1726867244.66035: done getting the remaining hosts for this loop 15247 1726867244.66039: getting the next task for host managed_node2 15247 1726867244.66046: done getting next task for host managed_node2 15247 1726867244.66054: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15247 1726867244.66057: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867244.66070: getting variables 15247 1726867244.66071: in VariableManager get_vars() 15247 1726867244.66117: Calling all_inventory to load vars for managed_node2 15247 1726867244.66120: Calling groups_inventory to load vars for managed_node2 15247 1726867244.66122: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867244.66135: Calling all_plugins_play to load vars for managed_node2 15247 1726867244.66138: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867244.66141: Calling groups_plugins_play to load vars for managed_node2 15247 1726867244.68122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867244.69708: done with get_vars() 15247 1726867244.69730: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:44 -0400 
(0:00:00.063) 0:00:14.407 ****** 15247 1726867244.69824: entering _queue_task() for managed_node2/ping 15247 1726867244.69826: Creating lock for ping 15247 1726867244.70120: worker is 1 (out of 1 available) 15247 1726867244.70133: exiting _queue_task() for managed_node2/ping 15247 1726867244.70143: done queuing things up, now waiting for results queue to drain 15247 1726867244.70144: waiting for pending results... 15247 1726867244.70493: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15247 1726867244.70498: in run() - task 0affcac9-a3a5-8ce3-1923-000000000029 15247 1726867244.70501: variable 'ansible_search_path' from source: unknown 15247 1726867244.70503: variable 'ansible_search_path' from source: unknown 15247 1726867244.70520: calling self._execute() 15247 1726867244.70661: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.70687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.70727: variable 'omit' from source: magic vars 15247 1726867244.71430: variable 'ansible_distribution_major_version' from source: facts 15247 1726867244.71466: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867244.71471: variable 'omit' from source: magic vars 15247 1726867244.71572: variable 'omit' from source: magic vars 15247 1726867244.71665: variable 'omit' from source: magic vars 15247 1726867244.71784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867244.71892: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867244.71904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867244.72008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 
1726867244.72065: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867244.72123: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867244.72126: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.72128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.72308: Set connection var ansible_shell_executable to /bin/sh 15247 1726867244.72311: Set connection var ansible_connection to ssh 15247 1726867244.72314: Set connection var ansible_shell_type to sh 15247 1726867244.72434: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867244.72437: Set connection var ansible_timeout to 10 15247 1726867244.72442: Set connection var ansible_pipelining to False 15247 1726867244.72444: variable 'ansible_shell_executable' from source: unknown 15247 1726867244.72447: variable 'ansible_connection' from source: unknown 15247 1726867244.72449: variable 'ansible_module_compression' from source: unknown 15247 1726867244.72451: variable 'ansible_shell_type' from source: unknown 15247 1726867244.72453: variable 'ansible_shell_executable' from source: unknown 15247 1726867244.72455: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867244.72457: variable 'ansible_pipelining' from source: unknown 15247 1726867244.72460: variable 'ansible_timeout' from source: unknown 15247 1726867244.72461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867244.72799: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867244.72810: variable 'omit' from source: magic vars 15247 1726867244.72813: 
starting attempt loop 15247 1726867244.72815: running the handler 15247 1726867244.72855: _low_level_execute_command(): starting 15247 1726867244.72867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867244.73921: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867244.73946: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867244.74016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867244.75740: stdout chunk (state=3): >>>/root <<< 15247 1726867244.75884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867244.75888: stdout chunk (state=3): >>><<< 15247 1726867244.75890: stderr chunk (state=3): >>><<< 15247 1726867244.76016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867244.76020: _low_level_execute_command(): starting 15247 1726867244.76024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791 `" && echo ansible-tmp-1726867244.7594388-16029-217839489788791="` echo /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791 `" ) && sleep 0' 15247 1726867244.76925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867244.76944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867244.76961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867244.77030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867244.77051: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867244.77101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867244.77150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867244.77163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867244.77231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867244.79302: stdout chunk (state=3): >>>ansible-tmp-1726867244.7594388-16029-217839489788791=/root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791 <<< 15247 1726867244.79331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867244.79395: stderr chunk (state=3): >>><<< 15247 1726867244.79399: stdout chunk (state=3): >>><<< 15247 1726867244.79497: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867244.7594388-16029-217839489788791=/root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867244.79541: variable 'ansible_module_compression' from source: unknown 15247 1726867244.79580: ANSIBALLZ: Using lock for ping 15247 1726867244.79921: ANSIBALLZ: Acquiring lock 15247 1726867244.79924: ANSIBALLZ: Lock acquired: 140393874633264 15247 1726867244.79927: ANSIBALLZ: Creating module 15247 1726867244.93439: ANSIBALLZ: Writing module into payload 15247 1726867244.93512: ANSIBALLZ: Writing module 15247 1726867244.93539: ANSIBALLZ: Renaming module 15247 1726867244.93551: ANSIBALLZ: Done creating module 15247 1726867244.93573: variable 'ansible_facts' from source: unknown 15247 1726867244.93652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py 15247 1726867244.93890: Sending initial data 15247 1726867244.93893: Sent initial data (153 bytes) 15247 1726867244.94625: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867244.94638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867244.94695: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867244.94758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867244.94773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867244.94997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867244.95064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867244.96732: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867244.96885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867244.96939: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpmai_u5mu /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py <<< 15247 1726867244.96954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py" <<< 15247 1726867244.96992: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpmai_u5mu" to remote "/root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py" <<< 15247 1726867244.98535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867244.98538: stdout chunk (state=3): >>><<< 15247 1726867244.98540: stderr chunk (state=3): >>><<< 15247 1726867244.98542: done transferring module to remote 15247 1726867244.98544: _low_level_execute_command(): starting 15247 1726867244.98546: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/ /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py && sleep 0' 15247 1726867244.99974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867244.99990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867245.00012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.00079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.01982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867245.01992: stdout chunk (state=3): >>><<< 15247 1726867245.02196: stderr chunk (state=3): >>><<< 15247 1726867245.02287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867245.02291: _low_level_execute_command(): starting 15247 1726867245.02294: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/AnsiballZ_ping.py && sleep 0' 15247 1726867245.03467: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867245.03896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.03969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.19171: stdout chunk (state=3): >>> {"ping": 
"pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15247 1726867245.20562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867245.20586: stderr chunk (state=3): >>><<< 15247 1726867245.20595: stdout chunk (state=3): >>><<< 15247 1726867245.20620: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867245.20648: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867245.20693: _low_level_execute_command(): starting 15247 1726867245.20696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867244.7594388-16029-217839489788791/ > /dev/null 2>&1 && sleep 0' 15247 1726867245.21254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867245.21360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867245.21363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.21429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.23391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867245.23394: stdout chunk (state=3): >>><<< 15247 1726867245.23397: stderr chunk (state=3): >>><<< 15247 1726867245.23399: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867245.23408: handler run complete 15247 1726867245.23411: attempt loop complete, returning result 15247 1726867245.23413: _execute() done 15247 1726867245.23415: dumping result to json 15247 
1726867245.23418: done dumping result, returning 15247 1726867245.23420: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-8ce3-1923-000000000029] 15247 1726867245.23422: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000029 15247 1726867245.23756: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000029 15247 1726867245.23759: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 15247 1726867245.23824: no more pending results, returning what we have 15247 1726867245.23827: results queue empty 15247 1726867245.23828: checking for any_errors_fatal 15247 1726867245.23834: done checking for any_errors_fatal 15247 1726867245.23835: checking for max_fail_percentage 15247 1726867245.23837: done checking for max_fail_percentage 15247 1726867245.23838: checking to see if all hosts have failed and the running result is not ok 15247 1726867245.23839: done checking to see if all hosts have failed 15247 1726867245.23839: getting the remaining hosts for this loop 15247 1726867245.23841: done getting the remaining hosts for this loop 15247 1726867245.23844: getting the next task for host managed_node2 15247 1726867245.23851: done getting next task for host managed_node2 15247 1726867245.23853: ^ task is: TASK: meta (role_complete) 15247 1726867245.23855: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867245.23863: getting variables 15247 1726867245.23865: in VariableManager get_vars() 15247 1726867245.23932: Calling all_inventory to load vars for managed_node2 15247 1726867245.23935: Calling groups_inventory to load vars for managed_node2 15247 1726867245.23938: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867245.23950: Calling all_plugins_play to load vars for managed_node2 15247 1726867245.23953: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867245.23956: Calling groups_plugins_play to load vars for managed_node2 15247 1726867245.25599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867245.26439: done with get_vars() 15247 1726867245.26455: done getting variables 15247 1726867245.26517: done queuing things up, now waiting for results queue to drain 15247 1726867245.26519: results queue empty 15247 1726867245.26519: checking for any_errors_fatal 15247 1726867245.26521: done checking for any_errors_fatal 15247 1726867245.26521: checking for max_fail_percentage 15247 1726867245.26522: done checking for max_fail_percentage 15247 1726867245.26522: checking to see if all hosts have failed and the running result is not ok 15247 1726867245.26523: done checking to see if all hosts have failed 15247 1726867245.26523: getting the remaining hosts for this loop 15247 1726867245.26524: done getting the remaining hosts for this loop 15247 1726867245.26526: getting the next task for host managed_node2 15247 1726867245.26529: done getting next task for host managed_node2 15247 1726867245.26530: ^ task is: TASK: meta (flush_handlers) 15247 1726867245.26531: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15247 1726867245.26533: getting variables 15247 1726867245.26533: in VariableManager get_vars() 15247 1726867245.26541: Calling all_inventory to load vars for managed_node2 15247 1726867245.26543: Calling groups_inventory to load vars for managed_node2 15247 1726867245.26544: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867245.26547: Calling all_plugins_play to load vars for managed_node2 15247 1726867245.26549: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867245.26550: Calling groups_plugins_play to load vars for managed_node2 15247 1726867245.28076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867245.30313: done with get_vars() 15247 1726867245.30328: done getting variables 15247 1726867245.30395: in VariableManager get_vars() 15247 1726867245.30411: Calling all_inventory to load vars for managed_node2 15247 1726867245.30414: Calling groups_inventory to load vars for managed_node2 15247 1726867245.30416: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867245.30421: Calling all_plugins_play to load vars for managed_node2 15247 1726867245.30424: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867245.30427: Calling groups_plugins_play to load vars for managed_node2 15247 1726867245.31354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867245.32828: done with get_vars() 15247 1726867245.32848: done queuing things up, now waiting for results queue to drain 15247 1726867245.32849: results queue empty 15247 1726867245.32851: checking for any_errors_fatal 15247 1726867245.32852: done checking for any_errors_fatal 15247 1726867245.32852: checking for max_fail_percentage 15247 1726867245.32853: done checking for max_fail_percentage 15247 1726867245.32853: checking to see if all hosts have failed and 
the running result is not ok 15247 1726867245.32854: done checking to see if all hosts have failed 15247 1726867245.32854: getting the remaining hosts for this loop 15247 1726867245.32855: done getting the remaining hosts for this loop 15247 1726867245.32857: getting the next task for host managed_node2 15247 1726867245.32859: done getting next task for host managed_node2 15247 1726867245.32860: ^ task is: TASK: meta (flush_handlers) 15247 1726867245.32861: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867245.32863: getting variables 15247 1726867245.32864: in VariableManager get_vars() 15247 1726867245.32872: Calling all_inventory to load vars for managed_node2 15247 1726867245.32874: Calling groups_inventory to load vars for managed_node2 15247 1726867245.32875: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867245.32880: Calling all_plugins_play to load vars for managed_node2 15247 1726867245.32882: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867245.32884: Calling groups_plugins_play to load vars for managed_node2 15247 1726867245.33536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867245.34373: done with get_vars() 15247 1726867245.34388: done getting variables 15247 1726867245.34419: in VariableManager get_vars() 15247 1726867245.34426: Calling all_inventory to load vars for managed_node2 15247 1726867245.34427: Calling groups_inventory to load vars for managed_node2 15247 1726867245.34429: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867245.34432: Calling all_plugins_play to load vars for managed_node2 15247 1726867245.34433: Calling 
groups_plugins_inventory to load vars for managed_node2 15247 1726867245.34435: Calling groups_plugins_play to load vars for managed_node2 15247 1726867245.35598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867245.37304: done with get_vars() 15247 1726867245.37328: done queuing things up, now waiting for results queue to drain 15247 1726867245.37332: results queue empty 15247 1726867245.37333: checking for any_errors_fatal 15247 1726867245.37334: done checking for any_errors_fatal 15247 1726867245.37334: checking for max_fail_percentage 15247 1726867245.37335: done checking for max_fail_percentage 15247 1726867245.37336: checking to see if all hosts have failed and the running result is not ok 15247 1726867245.37337: done checking to see if all hosts have failed 15247 1726867245.37337: getting the remaining hosts for this loop 15247 1726867245.37338: done getting the remaining hosts for this loop 15247 1726867245.37341: getting the next task for host managed_node2 15247 1726867245.37347: done getting next task for host managed_node2 15247 1726867245.37348: ^ task is: None 15247 1726867245.37349: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867245.37350: done queuing things up, now waiting for results queue to drain 15247 1726867245.37351: results queue empty 15247 1726867245.37352: checking for any_errors_fatal 15247 1726867245.37353: done checking for any_errors_fatal 15247 1726867245.37353: checking for max_fail_percentage 15247 1726867245.37354: done checking for max_fail_percentage 15247 1726867245.37355: checking to see if all hosts have failed and the running result is not ok 15247 1726867245.37355: done checking to see if all hosts have failed 15247 1726867245.37356: getting the next task for host managed_node2 15247 1726867245.37363: done getting next task for host managed_node2 15247 1726867245.37364: ^ task is: None 15247 1726867245.37366: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867245.37418: in VariableManager get_vars() 15247 1726867245.37436: done with get_vars() 15247 1726867245.37442: in VariableManager get_vars() 15247 1726867245.37456: done with get_vars() 15247 1726867245.37462: variable 'omit' from source: magic vars 15247 1726867245.37622: variable 'task' from source: play vars 15247 1726867245.37645: in VariableManager get_vars() 15247 1726867245.37656: done with get_vars() 15247 1726867245.37674: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 15247 1726867245.37815: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867245.37835: getting the remaining hosts for this loop 15247 1726867245.37836: done getting the remaining hosts for this loop 15247 1726867245.37838: getting the next task for host managed_node2 15247 1726867245.37839: done getting next task for host managed_node2 15247 1726867245.37841: ^ task is: TASK: Gathering Facts 15247 1726867245.37841: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867245.37843: getting variables 15247 1726867245.37843: in VariableManager get_vars() 15247 1726867245.37849: Calling all_inventory to load vars for managed_node2 15247 1726867245.37850: Calling groups_inventory to load vars for managed_node2 15247 1726867245.37852: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867245.37855: Calling all_plugins_play to load vars for managed_node2 15247 1726867245.37856: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867245.37858: Calling groups_plugins_play to load vars for managed_node2 15247 1726867245.39033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867245.40090: done with get_vars() 15247 1726867245.40104: done getting variables 15247 1726867245.40133: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 17:20:45 -0400 (0:00:00.703) 0:00:15.111 ****** 15247 1726867245.40149: entering _queue_task() for managed_node2/gather_facts 15247 1726867245.40432: worker is 1 (out of 1 available) 15247 1726867245.40444: exiting _queue_task() for managed_node2/gather_facts 15247 1726867245.40454: done queuing things up, now waiting for results queue to drain 15247 1726867245.40455: waiting for pending results... 
15247 1726867245.40693: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867245.40778: in run() - task 0affcac9-a3a5-8ce3-1923-000000000219 15247 1726867245.40790: variable 'ansible_search_path' from source: unknown 15247 1726867245.40825: calling self._execute() 15247 1726867245.40885: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867245.40890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867245.40920: variable 'omit' from source: magic vars 15247 1726867245.41201: variable 'ansible_distribution_major_version' from source: facts 15247 1726867245.41218: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867245.41221: variable 'omit' from source: magic vars 15247 1726867245.41241: variable 'omit' from source: magic vars 15247 1726867245.41310: variable 'omit' from source: magic vars 15247 1726867245.41330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867245.41367: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867245.41387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867245.41404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867245.41473: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867245.41476: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867245.41480: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867245.41483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867245.41615: Set connection var ansible_shell_executable to /bin/sh 15247 1726867245.41619: Set 
connection var ansible_connection to ssh 15247 1726867245.41622: Set connection var ansible_shell_type to sh 15247 1726867245.41624: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867245.41625: Set connection var ansible_timeout to 10 15247 1726867245.41638: Set connection var ansible_pipelining to False 15247 1726867245.41641: variable 'ansible_shell_executable' from source: unknown 15247 1726867245.41645: variable 'ansible_connection' from source: unknown 15247 1726867245.41659: variable 'ansible_module_compression' from source: unknown 15247 1726867245.41662: variable 'ansible_shell_type' from source: unknown 15247 1726867245.41665: variable 'ansible_shell_executable' from source: unknown 15247 1726867245.41667: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867245.41670: variable 'ansible_pipelining' from source: unknown 15247 1726867245.41672: variable 'ansible_timeout' from source: unknown 15247 1726867245.41675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867245.41839: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867245.41848: variable 'omit' from source: magic vars 15247 1726867245.41851: starting attempt loop 15247 1726867245.41854: running the handler 15247 1726867245.41891: variable 'ansible_facts' from source: unknown 15247 1726867245.41940: _low_level_execute_command(): starting 15247 1726867245.41943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867245.42801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867245.42824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867245.42845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867245.42864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867245.42987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.43045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.44728: stdout chunk (state=3): >>>/root <<< 15247 1726867245.44840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867245.44868: stderr chunk (state=3): >>><<< 15247 1726867245.44871: stdout chunk (state=3): >>><<< 15247 1726867245.44898: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867245.44920: _low_level_execute_command(): starting 15247 1726867245.44924: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263 `" && echo ansible-tmp-1726867245.4489112-16079-159876296845263="` echo /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263 `" ) && sleep 0' 15247 1726867245.45649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867245.45653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867245.45656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867245.45661: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867245.45672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867245.45793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.47649: stdout chunk (state=3): >>>ansible-tmp-1726867245.4489112-16079-159876296845263=/root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263 <<< 15247 1726867245.47749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867245.47776: stderr chunk (state=3): >>><<< 15247 1726867245.47782: stdout chunk (state=3): >>><<< 15247 1726867245.47793: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867245.4489112-16079-159876296845263=/root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867245.47817: variable 'ansible_module_compression' from source: unknown 15247 1726867245.47853: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867245.47906: variable 'ansible_facts' from source: unknown 15247 1726867245.48038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py 15247 1726867245.48363: Sending initial data 15247 1726867245.48365: Sent initial data (154 bytes) 15247 1726867245.48992: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 15247 1726867245.49058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867245.49065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867245.49134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.49232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.50893: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867245.50987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867245.51082: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp9ogo15ps /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py <<< 15247 1726867245.51085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py" <<< 15247 1726867245.51147: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp9ogo15ps" to remote "/root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py" <<< 15247 1726867245.54041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867245.54044: stdout chunk (state=3): >>><<< 15247 1726867245.54047: stderr chunk (state=3): >>><<< 15247 1726867245.54049: done transferring module to remote 15247 1726867245.54052: _low_level_execute_command(): starting 15247 1726867245.54055: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/ /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py && sleep 0' 15247 1726867245.54670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867245.54688: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867245.54708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867245.54834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867245.54853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.54926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867245.56895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867245.56905: stdout chunk (state=3): >>><<< 15247 1726867245.56922: stderr chunk (state=3): >>><<< 15247 1726867245.57021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867245.57026: _low_level_execute_command(): starting 15247 1726867245.57029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/AnsiballZ_setup.py && sleep 0' 15247 1726867245.57889: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867245.57963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867245.57988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867245.58007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867245.58158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 15247 1726867246.22839: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.6103515625, "5m": 0.38818359375, "15m": 0.1904296875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_is_chroot": false, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) <<< 15247 1726867246.22894: stdout chunk (state=3): >>>CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": 
"2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 484, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796888576, "block_size": 4096, "block_total": 65519099, "block_available": 63915256, "block_used": 1603843, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "46", "epoch": "1726867246", "epoch_int": "1726867246", "date": "2024-09-20", "time": "17:20:46", "iso8601_micro": "2024-09-20T21:20:46.176891Z", "iso8601": "2024-09-20T21:20:46Z", "iso8601_basic": "20240920T172046176891", "iso8601_basic_short": "20240920T172046", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "62:84:2b:2f:a5:23", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": 
"vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", 
"hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [f<<< 15247 1726867246.22928: stdout chunk (state=3): >>>ixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off 
[fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": 
["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867246.24870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867246.24893: stderr chunk (state=3): >>><<< 15247 1726867246.24917: stdout chunk (state=3): >>><<< 15247 1726867246.24966: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", 
"DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.6103515625, "5m": 0.38818359375, "15m": 0.1904296875}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", 
"ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_is_chroot": false, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": 
"NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 484, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796888576, "block_size": 4096, "block_total": 65519099, "block_available": 63915256, "block_used": 1603843, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": 
"20", "hour": "17", "minute": "20", "second": "46", "epoch": "1726867246", "epoch_int": "1726867246", "date": "2024-09-20", "time": "17:20:46", "iso8601_micro": "2024-09-20T21:20:46.176891Z", "iso8601": "2024-09-20T21:20:46Z", "iso8601_basic": "20240920T172046176891", "iso8601_basic_short": "20240920T172046", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "62:84:2b:2f:a5:23", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off 
[fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867246.25400: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867246.25417: _low_level_execute_command(): starting 15247 1726867246.25484: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867245.4489112-16079-159876296845263/ > /dev/null 2>&1 && sleep 0' 15247 1726867246.26082: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867246.26095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867246.26109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.26127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867246.26181: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867246.26195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.26251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867246.26273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.26308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.28194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867246.28200: stdout chunk (state=3): >>><<< 15247 1726867246.28202: stderr chunk (state=3): >>><<< 15247 1726867246.28223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867246.28382: handler run complete 15247 1726867246.28385: variable 'ansible_facts' from source: unknown 15247 1726867246.28481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.28793: variable 'ansible_facts' from source: unknown 15247 1726867246.28854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.28940: attempt loop complete, returning result 15247 1726867246.28943: _execute() done 15247 1726867246.28945: dumping result to json 15247 1726867246.28964: done dumping result, returning 15247 1726867246.28971: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-000000000219] 15247 1726867246.28978: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000219 15247 1726867246.29270: done sending task result for task 
0affcac9-a3a5-8ce3-1923-000000000219 15247 1726867246.29273: WORKER PROCESS EXITING ok: [managed_node2] 15247 1726867246.29502: no more pending results, returning what we have 15247 1726867246.29505: results queue empty 15247 1726867246.29505: checking for any_errors_fatal 15247 1726867246.29507: done checking for any_errors_fatal 15247 1726867246.29507: checking for max_fail_percentage 15247 1726867246.29509: done checking for max_fail_percentage 15247 1726867246.29509: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.29510: done checking to see if all hosts have failed 15247 1726867246.29510: getting the remaining hosts for this loop 15247 1726867246.29511: done getting the remaining hosts for this loop 15247 1726867246.29513: getting the next task for host managed_node2 15247 1726867246.29517: done getting next task for host managed_node2 15247 1726867246.29518: ^ task is: TASK: meta (flush_handlers) 15247 1726867246.29519: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.29522: getting variables 15247 1726867246.29523: in VariableManager get_vars() 15247 1726867246.29538: Calling all_inventory to load vars for managed_node2 15247 1726867246.29540: Calling groups_inventory to load vars for managed_node2 15247 1726867246.29542: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.29549: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.29553: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.29556: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.34357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.35870: done with get_vars() 15247 1726867246.35898: done getting variables 15247 1726867246.35952: in VariableManager get_vars() 15247 1726867246.35961: Calling all_inventory to load vars for managed_node2 15247 1726867246.35964: Calling groups_inventory to load vars for managed_node2 15247 1726867246.35966: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.35971: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.35973: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.35975: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.37087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.38593: done with get_vars() 15247 1726867246.38618: done queuing things up, now waiting for results queue to drain 15247 1726867246.38620: results queue empty 15247 1726867246.38621: checking for any_errors_fatal 15247 1726867246.38625: done checking for any_errors_fatal 15247 1726867246.38630: checking for max_fail_percentage 15247 1726867246.38632: done checking for max_fail_percentage 15247 1726867246.38632: checking to see if all hosts have failed and the running result is not 
ok 15247 1726867246.38633: done checking to see if all hosts have failed 15247 1726867246.38634: getting the remaining hosts for this loop 15247 1726867246.38635: done getting the remaining hosts for this loop 15247 1726867246.38637: getting the next task for host managed_node2 15247 1726867246.38641: done getting next task for host managed_node2 15247 1726867246.38644: ^ task is: TASK: Include the task '{{ task }}' 15247 1726867246.38645: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867246.38647: getting variables 15247 1726867246.38648: in VariableManager get_vars() 15247 1726867246.38657: Calling all_inventory to load vars for managed_node2 15247 1726867246.38659: Calling groups_inventory to load vars for managed_node2 15247 1726867246.38662: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.38667: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.38669: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.38672: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.39821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.41312: done with get_vars() 15247 1726867246.41329: done getting variables 15247 1726867246.41472: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 17:20:46 -0400 (0:00:01.013) 0:00:16.124 ****** 15247 1726867246.41500: entering _queue_task() for managed_node2/include_tasks 15247 1726867246.41835: worker is 
1 (out of 1 available) 15247 1726867246.41847: exiting _queue_task() for managed_node2/include_tasks 15247 1726867246.41857: done queuing things up, now waiting for results queue to drain 15247 1726867246.41858: waiting for pending results... 15247 1726867246.42302: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_present.yml' 15247 1726867246.42307: in run() - task 0affcac9-a3a5-8ce3-1923-00000000002d 15247 1726867246.42311: variable 'ansible_search_path' from source: unknown 15247 1726867246.42314: calling self._execute() 15247 1726867246.42394: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.42410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.42425: variable 'omit' from source: magic vars 15247 1726867246.42837: variable 'ansible_distribution_major_version' from source: facts 15247 1726867246.42841: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867246.42843: variable 'task' from source: play vars 15247 1726867246.42902: variable 'task' from source: play vars 15247 1726867246.42915: _execute() done 15247 1726867246.42924: dumping result to json 15247 1726867246.42930: done dumping result, returning 15247 1726867246.42944: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_present.yml' [0affcac9-a3a5-8ce3-1923-00000000002d] 15247 1726867246.42955: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000002d 15247 1726867246.43179: no more pending results, returning what we have 15247 1726867246.43186: in VariableManager get_vars() 15247 1726867246.43220: Calling all_inventory to load vars for managed_node2 15247 1726867246.43224: Calling groups_inventory to load vars for managed_node2 15247 1726867246.43228: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.43242: Calling all_plugins_play to load vars for managed_node2 15247 
1726867246.43245: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.43249: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.43890: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000002d 15247 1726867246.43893: WORKER PROCESS EXITING 15247 1726867246.44642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.45996: done with get_vars() 15247 1726867246.46008: variable 'ansible_search_path' from source: unknown 15247 1726867246.46018: we have included files to process 15247 1726867246.46018: generating all_blocks data 15247 1726867246.46019: done generating all_blocks data 15247 1726867246.46020: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15247 1726867246.46021: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15247 1726867246.46022: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 15247 1726867246.46126: in VariableManager get_vars() 15247 1726867246.46136: done with get_vars() 15247 1726867246.46211: done processing included file 15247 1726867246.46212: iterating over new_blocks loaded from include file 15247 1726867246.46213: in VariableManager get_vars() 15247 1726867246.46220: done with get_vars() 15247 1726867246.46221: filtering new block on tags 15247 1726867246.46232: done filtering new block on tags 15247 1726867246.46233: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 15247 1726867246.46236: extending task lists for all hosts with included blocks 15247 
1726867246.46255: done extending task lists 15247 1726867246.46256: done processing included files 15247 1726867246.46256: results queue empty 15247 1726867246.46257: checking for any_errors_fatal 15247 1726867246.46258: done checking for any_errors_fatal 15247 1726867246.46259: checking for max_fail_percentage 15247 1726867246.46260: done checking for max_fail_percentage 15247 1726867246.46261: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.46261: done checking to see if all hosts have failed 15247 1726867246.46262: getting the remaining hosts for this loop 15247 1726867246.46262: done getting the remaining hosts for this loop 15247 1726867246.46264: getting the next task for host managed_node2 15247 1726867246.46266: done getting next task for host managed_node2 15247 1726867246.46268: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15247 1726867246.46269: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.46270: getting variables 15247 1726867246.46271: in VariableManager get_vars() 15247 1726867246.46276: Calling all_inventory to load vars for managed_node2 15247 1726867246.46279: Calling groups_inventory to load vars for managed_node2 15247 1726867246.46281: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.46284: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.46285: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.46287: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.46974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.48141: done with get_vars() 15247 1726867246.48154: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:20:46 -0400 (0:00:00.067) 0:00:16.191 ****** 15247 1726867246.48221: entering _queue_task() for managed_node2/include_tasks 15247 1726867246.48470: worker is 1 (out of 1 available) 15247 1726867246.48483: exiting _queue_task() for managed_node2/include_tasks 15247 1726867246.48495: done queuing things up, now waiting for results queue to drain 15247 1726867246.48496: waiting for pending results... 
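For context: the log does not show the contents of the two included task files, only their paths and the `stat` module arguments logged further down (`path: /sys/class/net/LSR-TST-br31`, `get_attributes: false`, `get_checksum: false`, `get_mime: false`). A plausible reconstruction of `tasks/get_interface_stat.yml` consistent with those arguments — not the verbatim file, and the registered variable name is a guess — would be:

```yaml
# tasks/get_interface_stat.yml -- reconstruction inferred from the module_args
# in this log, NOT the verbatim file contents.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # variable name assumed for illustration
```

`assert_device_present.yml` would then include this file and assert on `interface_stat.stat.exists`, which matches the task sequence the strategy runs next.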
15247 1726867246.48766: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 15247 1726867246.48897: in run() - task 0affcac9-a3a5-8ce3-1923-00000000022a 15247 1726867246.48917: variable 'ansible_search_path' from source: unknown 15247 1726867246.48925: variable 'ansible_search_path' from source: unknown 15247 1726867246.48962: calling self._execute() 15247 1726867246.49052: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.49064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.49080: variable 'omit' from source: magic vars 15247 1726867246.49462: variable 'ansible_distribution_major_version' from source: facts 15247 1726867246.49471: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867246.49478: _execute() done 15247 1726867246.49482: dumping result to json 15247 1726867246.49484: done dumping result, returning 15247 1726867246.49492: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-8ce3-1923-00000000022a] 15247 1726867246.49497: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000022a 15247 1726867246.49596: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000022a 15247 1726867246.49600: WORKER PROCESS EXITING 15247 1726867246.49627: no more pending results, returning what we have 15247 1726867246.49632: in VariableManager get_vars() 15247 1726867246.49664: Calling all_inventory to load vars for managed_node2 15247 1726867246.49666: Calling groups_inventory to load vars for managed_node2 15247 1726867246.49670: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.49682: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.49685: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.49688: Calling groups_plugins_play to load vars for managed_node2 15247 
1726867246.50439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.51599: done with get_vars() 15247 1726867246.51616: variable 'ansible_search_path' from source: unknown 15247 1726867246.51617: variable 'ansible_search_path' from source: unknown 15247 1726867246.51625: variable 'task' from source: play vars 15247 1726867246.51731: variable 'task' from source: play vars 15247 1726867246.51763: we have included files to process 15247 1726867246.51764: generating all_blocks data 15247 1726867246.51766: done generating all_blocks data 15247 1726867246.51767: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867246.51768: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867246.51770: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867246.51951: done processing included file 15247 1726867246.51953: iterating over new_blocks loaded from include file 15247 1726867246.51955: in VariableManager get_vars() 15247 1726867246.51970: done with get_vars() 15247 1726867246.51972: filtering new block on tags 15247 1726867246.51984: done filtering new block on tags 15247 1726867246.51986: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 15247 1726867246.51992: extending task lists for all hosts with included blocks 15247 1726867246.52075: done extending task lists 15247 1726867246.52076: done processing included files 15247 1726867246.52076: results queue empty 15247 1726867246.52079: checking for any_errors_fatal 15247 1726867246.52081: done checking 
for any_errors_fatal 15247 1726867246.52082: checking for max_fail_percentage 15247 1726867246.52082: done checking for max_fail_percentage 15247 1726867246.52083: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.52083: done checking to see if all hosts have failed 15247 1726867246.52084: getting the remaining hosts for this loop 15247 1726867246.52084: done getting the remaining hosts for this loop 15247 1726867246.52086: getting the next task for host managed_node2 15247 1726867246.52088: done getting next task for host managed_node2 15247 1726867246.52090: ^ task is: TASK: Get stat for interface {{ interface }} 15247 1726867246.52092: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.52093: getting variables 15247 1726867246.52093: in VariableManager get_vars() 15247 1726867246.52099: Calling all_inventory to load vars for managed_node2 15247 1726867246.52101: Calling groups_inventory to load vars for managed_node2 15247 1726867246.52102: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.52105: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.52107: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.52109: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.52746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.53581: done with get_vars() 15247 1726867246.53594: done getting variables 15247 1726867246.53676: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:20:46 -0400 (0:00:00.054) 0:00:16.246 ****** 15247 1726867246.53697: entering _queue_task() for managed_node2/stat 15247 1726867246.53901: worker is 1 (out of 1 available) 15247 1726867246.53912: exiting _queue_task() for managed_node2/stat 15247 1726867246.53923: done queuing things up, now waiting for results queue to drain 15247 1726867246.53924: waiting for pending results... 
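Each module run that follows starts with `_low_level_execute_command()` creating a per-task remote temp directory. The quoted command in the log uses the `( umask 77 && mkdir -p ... )` idiom: the subshell confines the umask change, and `umask 77` makes the new directories mode 0700 so only the connecting user can read the staged module. A standalone sketch of that pattern (the directory names here are stand-ins, not the paths from this run):

```shell
#!/bin/sh
# Sketch of the remote temp-dir creation idiom seen in this log.
# The subshell keeps the umask change local; 'umask 77' => mode 0700 dirs.
base="${TMPDIR:-/tmp}/ansible-demo"               # stand-in for ~/.ansible/tmp
tmpdir="$base/ansible-tmp-$(date +%s)-$$-12345"   # timestamp-pid-suffix pattern
( umask 77 && mkdir -p "$base" && mkdir "$tmpdir" && echo "$tmpdir" )
```

The `echo` at the end is why the controller sees the directory name on stdout (as in the `rc=0, stdout=ansible-tmp-...` record below): the remote shell prints the path back so later commands can target it.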
15247 1726867246.54094: running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 15247 1726867246.54215: in run() - task 0affcac9-a3a5-8ce3-1923-000000000235 15247 1726867246.54219: variable 'ansible_search_path' from source: unknown 15247 1726867246.54221: variable 'ansible_search_path' from source: unknown 15247 1726867246.54288: calling self._execute() 15247 1726867246.54363: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.54369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.54372: variable 'omit' from source: magic vars 15247 1726867246.54780: variable 'ansible_distribution_major_version' from source: facts 15247 1726867246.54788: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867246.54794: variable 'omit' from source: magic vars 15247 1726867246.54854: variable 'omit' from source: magic vars 15247 1726867246.55025: variable 'interface' from source: set_fact 15247 1726867246.55042: variable 'omit' from source: magic vars 15247 1726867246.55046: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867246.55110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867246.55114: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867246.55144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867246.55147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867246.55164: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867246.55167: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.55171: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.55245: Set connection var ansible_shell_executable to /bin/sh 15247 1726867246.55249: Set connection var ansible_connection to ssh 15247 1726867246.55252: Set connection var ansible_shell_type to sh 15247 1726867246.55255: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867246.55262: Set connection var ansible_timeout to 10 15247 1726867246.55266: Set connection var ansible_pipelining to False 15247 1726867246.55284: variable 'ansible_shell_executable' from source: unknown 15247 1726867246.55288: variable 'ansible_connection' from source: unknown 15247 1726867246.55290: variable 'ansible_module_compression' from source: unknown 15247 1726867246.55292: variable 'ansible_shell_type' from source: unknown 15247 1726867246.55295: variable 'ansible_shell_executable' from source: unknown 15247 1726867246.55298: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.55300: variable 'ansible_pipelining' from source: unknown 15247 1726867246.55303: variable 'ansible_timeout' from source: unknown 15247 1726867246.55312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.55661: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867246.55665: variable 'omit' from source: magic vars 15247 1726867246.55669: starting attempt loop 15247 1726867246.55700: running the handler 15247 1726867246.55703: _low_level_execute_command(): starting 15247 1726867246.55705: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867246.56402: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867246.56427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867246.56449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.56541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.58219: stdout chunk (state=3): >>>/root <<< 15247 1726867246.58314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867246.58354: stderr chunk (state=3): >>><<< 15247 1726867246.58358: stdout chunk (state=3): >>><<< 15247 1726867246.58441: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867246.58447: _low_level_execute_command(): starting 15247 1726867246.58451: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855 `" && echo ansible-tmp-1726867246.583914-16127-39646390962855="` echo /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855 `" ) && sleep 0' 15247 1726867246.58969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.59049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867246.59084: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.59098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.60964: stdout chunk (state=3): >>>ansible-tmp-1726867246.583914-16127-39646390962855=/root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855 <<< 15247 1726867246.61075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867246.61098: stderr chunk (state=3): >>><<< 15247 1726867246.61101: stdout chunk (state=3): >>><<< 15247 1726867246.61114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867246.583914-16127-39646390962855=/root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867246.61148: variable 'ansible_module_compression' from source: unknown 15247 1726867246.61193: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15247 1726867246.61228: variable 'ansible_facts' from source: unknown 15247 1726867246.61276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py 15247 1726867246.61369: Sending initial data 15247 1726867246.61372: Sent initial data (151 bytes) 15247 1726867246.61800: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.61811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867246.61814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867246.61816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.61819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.61937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.61969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.63556: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15247 1726867246.63563: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867246.63585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867246.63627: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpwa7wenir /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py <<< 15247 1726867246.63630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py" <<< 15247 1726867246.63670: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpwa7wenir" to remote "/root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py" <<< 15247 1726867246.63673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py" <<< 15247 1726867246.64340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867246.64419: stderr chunk (state=3): >>><<< 15247 1726867246.64423: stdout chunk (state=3): >>><<< 15247 1726867246.64461: done transferring module to remote 15247 1726867246.64468: _low_level_execute_command(): starting 15247 1726867246.64472: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/ /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py && sleep 0' 15247 1726867246.64908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.64911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.64914: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867246.64916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867246.64918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.64962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867246.64966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.65011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.66801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867246.66826: stderr chunk (state=3): >>><<< 15247 1726867246.66830: stdout chunk (state=3): >>><<< 15247 1726867246.66841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867246.66844: _low_level_execute_command(): starting 15247 1726867246.66851: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/AnsiballZ_stat.py && sleep 0' 15247 1726867246.67279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.67282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.67285: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867246.67287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.67289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.67359: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867246.67363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.67422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.82561: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27703, "dev": 23, "nlink": 1, "atime": 1726867244.183225, "mtime": 1726867244.183225, "ctime": 1726867244.183225, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15247 1726867246.83882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867246.83911: stderr chunk (state=3): >>><<< 15247 1726867246.83914: stdout chunk (state=3): >>><<< 15247 1726867246.83928: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27703, "dev": 23, "nlink": 1, "atime": 1726867244.183225, "mtime": 1726867244.183225, "ctime": 1726867244.183225, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867246.83966: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867246.83974: _low_level_execute_command(): starting 15247 1726867246.83980: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867246.583914-16127-39646390962855/ > /dev/null 2>&1 && sleep 0' 15247 1726867246.84437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.84444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867246.84446: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.84448: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867246.84451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867246.84495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867246.84499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867246.84545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867246.86353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867246.86374: stderr chunk (state=3): >>><<< 15247 1726867246.86382: stdout chunk (state=3): >>><<< 15247 1726867246.86393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867246.86399: handler run complete 15247 1726867246.86431: attempt loop complete, returning result 15247 1726867246.86434: _execute() done 15247 1726867246.86436: dumping result to json 15247 1726867246.86441: done dumping result, returning 15247 1726867246.86448: done running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000235] 15247 1726867246.86454: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000235 15247 1726867246.86558: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000235 15247 1726867246.86560: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726867244.183225, "block_size": 4096, "blocks": 0, "ctime": 1726867244.183225, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27703, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1726867244.183225, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 15247 1726867246.86682: no more pending 
results, returning what we have 15247 1726867246.86686: results queue empty 15247 1726867246.86687: checking for any_errors_fatal 15247 1726867246.86688: done checking for any_errors_fatal 15247 1726867246.86689: checking for max_fail_percentage 15247 1726867246.86690: done checking for max_fail_percentage 15247 1726867246.86691: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.86692: done checking to see if all hosts have failed 15247 1726867246.86693: getting the remaining hosts for this loop 15247 1726867246.86694: done getting the remaining hosts for this loop 15247 1726867246.86697: getting the next task for host managed_node2 15247 1726867246.86705: done getting next task for host managed_node2 15247 1726867246.86707: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 15247 1726867246.86709: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.86714: getting variables 15247 1726867246.86715: in VariableManager get_vars() 15247 1726867246.86740: Calling all_inventory to load vars for managed_node2 15247 1726867246.86742: Calling groups_inventory to load vars for managed_node2 15247 1726867246.86745: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.86754: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.86756: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.86759: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.87613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.88468: done with get_vars() 15247 1726867246.88485: done getting variables 15247 1726867246.88529: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867246.88612: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:20:46 -0400 (0:00:00.349) 0:00:16.595 ****** 15247 1726867246.88635: entering _queue_task() for managed_node2/assert 15247 1726867246.88834: worker is 1 (out of 1 available) 15247 1726867246.88846: exiting _queue_task() for managed_node2/assert 15247 1726867246.88856: done queuing things up, now waiting for results queue to drain 15247 1726867246.88857: waiting for pending results... 
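The stat task whose result appears above checks for the interface by stat-ing `/sys/class/net/LSR-TST-br31`, which the kernel exposes as a symlink into `/sys/devices/virtual/net/...` whenever the interface exists (note `islnk: true` and `follow: false` in the module args). A minimal Python sketch of that check — not the `stat` module's actual implementation, and `interface_present` is a hypothetical helper name:

```python
import os

def interface_present(name: str, sysfs_root: str = "/sys/class/net") -> bool:
    """Return True if the kernel exposes a network interface `name`.

    Mirrors what the stat task in the log does: /sys/class/net/<name>
    is a symlink into /sys/devices/... whenever the interface exists.
    """
    path = os.path.join(sysfs_root, name)
    # follow=False in the module args corresponds to lstat(): we test
    # the symlink itself rather than dereferencing it.
    try:
        os.lstat(path)
    except FileNotFoundError:
        return False
    return True
```

The subsequent assert task then only needs the boolean, e.g. `interface_present("LSR-TST-br31")`.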
15247 1726867246.89027: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'LSR-TST-br31' 15247 1726867246.89098: in run() - task 0affcac9-a3a5-8ce3-1923-00000000022b 15247 1726867246.89111: variable 'ansible_search_path' from source: unknown 15247 1726867246.89116: variable 'ansible_search_path' from source: unknown 15247 1726867246.89142: calling self._execute() 15247 1726867246.89206: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.89214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.89222: variable 'omit' from source: magic vars 15247 1726867246.89479: variable 'ansible_distribution_major_version' from source: facts 15247 1726867246.89488: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867246.89493: variable 'omit' from source: magic vars 15247 1726867246.89528: variable 'omit' from source: magic vars 15247 1726867246.89595: variable 'interface' from source: set_fact 15247 1726867246.89609: variable 'omit' from source: magic vars 15247 1726867246.89643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867246.89670: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867246.89686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867246.89699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867246.89711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867246.89740: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867246.89743: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.89746: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.89813: Set connection var ansible_shell_executable to /bin/sh 15247 1726867246.89817: Set connection var ansible_connection to ssh 15247 1726867246.89819: Set connection var ansible_shell_type to sh 15247 1726867246.89821: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867246.89829: Set connection var ansible_timeout to 10 15247 1726867246.89834: Set connection var ansible_pipelining to False 15247 1726867246.89854: variable 'ansible_shell_executable' from source: unknown 15247 1726867246.89857: variable 'ansible_connection' from source: unknown 15247 1726867246.89860: variable 'ansible_module_compression' from source: unknown 15247 1726867246.89862: variable 'ansible_shell_type' from source: unknown 15247 1726867246.89864: variable 'ansible_shell_executable' from source: unknown 15247 1726867246.89866: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.89868: variable 'ansible_pipelining' from source: unknown 15247 1726867246.89871: variable 'ansible_timeout' from source: unknown 15247 1726867246.89874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.89973: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867246.89984: variable 'omit' from source: magic vars 15247 1726867246.89989: starting attempt loop 15247 1726867246.89992: running the handler 15247 1726867246.90085: variable 'interface_stat' from source: set_fact 15247 1726867246.90099: Evaluated conditional (interface_stat.stat.exists): True 15247 1726867246.90103: handler run complete 15247 1726867246.90117: attempt loop complete, returning result 15247 
1726867246.90119: _execute() done 15247 1726867246.90122: dumping result to json 15247 1726867246.90124: done dumping result, returning 15247 1726867246.90130: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'LSR-TST-br31' [0affcac9-a3a5-8ce3-1923-00000000022b] 15247 1726867246.90136: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000022b ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867246.90258: no more pending results, returning what we have 15247 1726867246.90261: results queue empty 15247 1726867246.90262: checking for any_errors_fatal 15247 1726867246.90269: done checking for any_errors_fatal 15247 1726867246.90269: checking for max_fail_percentage 15247 1726867246.90271: done checking for max_fail_percentage 15247 1726867246.90272: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.90273: done checking to see if all hosts have failed 15247 1726867246.90273: getting the remaining hosts for this loop 15247 1726867246.90275: done getting the remaining hosts for this loop 15247 1726867246.90280: getting the next task for host managed_node2 15247 1726867246.90288: done getting next task for host managed_node2 15247 1726867246.90290: ^ task is: TASK: meta (flush_handlers) 15247 1726867246.90291: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.90295: getting variables 15247 1726867246.90296: in VariableManager get_vars() 15247 1726867246.90319: Calling all_inventory to load vars for managed_node2 15247 1726867246.90321: Calling groups_inventory to load vars for managed_node2 15247 1726867246.90324: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.90332: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.90334: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.90336: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.90891: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000022b 15247 1726867246.90895: WORKER PROCESS EXITING 15247 1726867246.91092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.91944: done with get_vars() 15247 1726867246.91958: done getting variables 15247 1726867246.92002: in VariableManager get_vars() 15247 1726867246.92010: Calling all_inventory to load vars for managed_node2 15247 1726867246.92012: Calling groups_inventory to load vars for managed_node2 15247 1726867246.92013: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.92016: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.92018: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.92019: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.92708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.93549: done with get_vars() 15247 1726867246.93565: done queuing things up, now waiting for results queue to drain 15247 1726867246.93566: results queue empty 15247 1726867246.93567: checking for any_errors_fatal 15247 1726867246.93568: done checking for any_errors_fatal 15247 1726867246.93568: checking for max_fail_percentage 15247 
1726867246.93569: done checking for max_fail_percentage 15247 1726867246.93569: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.93570: done checking to see if all hosts have failed 15247 1726867246.93574: getting the remaining hosts for this loop 15247 1726867246.93575: done getting the remaining hosts for this loop 15247 1726867246.93576: getting the next task for host managed_node2 15247 1726867246.93581: done getting next task for host managed_node2 15247 1726867246.93582: ^ task is: TASK: meta (flush_handlers) 15247 1726867246.93582: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867246.93584: getting variables 15247 1726867246.93585: in VariableManager get_vars() 15247 1726867246.93589: Calling all_inventory to load vars for managed_node2 15247 1726867246.93591: Calling groups_inventory to load vars for managed_node2 15247 1726867246.93592: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.93595: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.93596: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.93598: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.94214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.95081: done with get_vars() 15247 1726867246.95094: done getting variables 15247 1726867246.95125: in VariableManager get_vars() 15247 1726867246.95130: Calling all_inventory to load vars for managed_node2 15247 1726867246.95131: Calling groups_inventory to load vars for managed_node2 15247 1726867246.95133: Calling all_plugins_inventory to load vars for managed_node2 15247 
1726867246.95135: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.95137: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.95138: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.95741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.96578: done with get_vars() 15247 1726867246.96594: done queuing things up, now waiting for results queue to drain 15247 1726867246.96596: results queue empty 15247 1726867246.96596: checking for any_errors_fatal 15247 1726867246.96597: done checking for any_errors_fatal 15247 1726867246.96598: checking for max_fail_percentage 15247 1726867246.96598: done checking for max_fail_percentage 15247 1726867246.96599: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.96599: done checking to see if all hosts have failed 15247 1726867246.96600: getting the remaining hosts for this loop 15247 1726867246.96600: done getting the remaining hosts for this loop 15247 1726867246.96602: getting the next task for host managed_node2 15247 1726867246.96603: done getting next task for host managed_node2 15247 1726867246.96604: ^ task is: None 15247 1726867246.96605: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.96606: done queuing things up, now waiting for results queue to drain 15247 1726867246.96607: results queue empty 15247 1726867246.96607: checking for any_errors_fatal 15247 1726867246.96608: done checking for any_errors_fatal 15247 1726867246.96608: checking for max_fail_percentage 15247 1726867246.96609: done checking for max_fail_percentage 15247 1726867246.96609: checking to see if all hosts have failed and the running result is not ok 15247 1726867246.96610: done checking to see if all hosts have failed 15247 1726867246.96610: getting the next task for host managed_node2 15247 1726867246.96611: done getting next task for host managed_node2 15247 1726867246.96612: ^ task is: None 15247 1726867246.96613: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.96642: in VariableManager get_vars() 15247 1726867246.96653: done with get_vars() 15247 1726867246.96656: in VariableManager get_vars() 15247 1726867246.96662: done with get_vars() 15247 1726867246.96665: variable 'omit' from source: magic vars 15247 1726867246.96740: variable 'task' from source: play vars 15247 1726867246.96760: in VariableManager get_vars() 15247 1726867246.96767: done with get_vars() 15247 1726867246.96782: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 15247 1726867246.96923: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867246.96943: getting the remaining hosts for this loop 15247 1726867246.96944: done getting the remaining hosts for this loop 15247 1726867246.96946: getting the next task for host managed_node2 15247 1726867246.96947: done getting next task for host managed_node2 15247 1726867246.96948: ^ task is: TASK: Gathering Facts 15247 1726867246.96949: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867246.96950: getting variables 15247 1726867246.96951: in VariableManager get_vars() 15247 1726867246.96956: Calling all_inventory to load vars for managed_node2 15247 1726867246.96958: Calling groups_inventory to load vars for managed_node2 15247 1726867246.96959: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867246.96962: Calling all_plugins_play to load vars for managed_node2 15247 1726867246.96963: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867246.96965: Calling groups_plugins_play to load vars for managed_node2 15247 1726867246.97645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867246.98470: done with get_vars() 15247 1726867246.98484: done getting variables 15247 1726867246.98513: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 17:20:46 -0400 (0:00:00.098) 0:00:16.694 ****** 15247 1726867246.98529: entering _queue_task() for managed_node2/gather_facts 15247 1726867246.98726: worker is 1 (out of 1 available) 15247 1726867246.98738: exiting _queue_task() for managed_node2/gather_facts 15247 1726867246.98748: done queuing things up, now waiting for results queue to drain 15247 1726867246.98749: waiting for pending results... 
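The assert task evaluated earlier ("Evaluated conditional (interface_stat.stat.exists): True", then "All assertions passed") reduces to resolving a dotted variable path against the registered stat result. A rough sketch of that resolution — `lookup` is a hypothetical helper, not how Ansible's Jinja templating is actually implemented:

```python
from functools import reduce

def lookup(data: dict, dotted_path: str):
    """Resolve a dotted path such as 'stat.exists' in a nested result
    dict, roughly what the Jinja expression in the assert's `that`
    clause evaluates."""
    return reduce(lambda d, key: d[key], dotted_path.split("."), data)

# Registered result, trimmed to the field the conditional uses.
interface_stat = {"stat": {"exists": True}}
passed = lookup(interface_stat, "stat.exists")
# A truthy result corresponds to the log's "All assertions passed".
```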
15247 1726867246.98909: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867246.98971: in run() - task 0affcac9-a3a5-8ce3-1923-00000000024e 15247 1726867246.98987: variable 'ansible_search_path' from source: unknown 15247 1726867246.99015: calling self._execute() 15247 1726867246.99072: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.99076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.99089: variable 'omit' from source: magic vars 15247 1726867246.99351: variable 'ansible_distribution_major_version' from source: facts 15247 1726867246.99360: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867246.99366: variable 'omit' from source: magic vars 15247 1726867246.99387: variable 'omit' from source: magic vars 15247 1726867246.99416: variable 'omit' from source: magic vars 15247 1726867246.99446: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867246.99471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867246.99487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867246.99501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867246.99513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867246.99539: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867246.99542: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.99544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.99614: Set connection var ansible_shell_executable to /bin/sh 15247 1726867246.99618: Set 
connection var ansible_connection to ssh 15247 1726867246.99621: Set connection var ansible_shell_type to sh 15247 1726867246.99625: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867246.99631: Set connection var ansible_timeout to 10 15247 1726867246.99636: Set connection var ansible_pipelining to False 15247 1726867246.99658: variable 'ansible_shell_executable' from source: unknown 15247 1726867246.99661: variable 'ansible_connection' from source: unknown 15247 1726867246.99664: variable 'ansible_module_compression' from source: unknown 15247 1726867246.99667: variable 'ansible_shell_type' from source: unknown 15247 1726867246.99670: variable 'ansible_shell_executable' from source: unknown 15247 1726867246.99672: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867246.99674: variable 'ansible_pipelining' from source: unknown 15247 1726867246.99676: variable 'ansible_timeout' from source: unknown 15247 1726867246.99680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867246.99805: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867246.99816: variable 'omit' from source: magic vars 15247 1726867246.99819: starting attempt loop 15247 1726867246.99822: running the handler 15247 1726867246.99835: variable 'ansible_facts' from source: unknown 15247 1726867246.99851: _low_level_execute_command(): starting 15247 1726867246.99860: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867247.00659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867247.00664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867247.00667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867247.00714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867247.02458: stdout chunk (state=3): >>>/root <<< 15247 1726867247.02502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867247.02551: stderr chunk (state=3): >>><<< 15247 1726867247.02571: stdout chunk (state=3): >>><<< 15247 1726867247.02609: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867247.02630: _low_level_execute_command(): starting 15247 1726867247.02710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053 `" && echo ansible-tmp-1726867247.026168-16137-26750815206053="` echo /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053 `" ) && sleep 0' 15247 1726867247.03227: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867247.03243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867247.03259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867247.03282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867247.03302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867247.03401: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.03430: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867247.03447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867247.03470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867247.03540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867247.05434: stdout chunk (state=3): >>>ansible-tmp-1726867247.026168-16137-26750815206053=/root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053 <<< 15247 1726867247.05540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867247.05560: stderr chunk (state=3): >>><<< 15247 1726867247.05563: stdout chunk (state=3): >>><<< 15247 1726867247.05576: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867247.026168-16137-26750815206053=/root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867247.05600: variable 'ansible_module_compression' from source: unknown 15247 1726867247.05642: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867247.05686: variable 'ansible_facts' from source: unknown 15247 1726867247.05819: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py 15247 1726867247.05913: Sending initial data 15247 1726867247.05917: Sent initial data (152 bytes) 15247 1726867247.06339: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867247.06342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867247.06344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.06346: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867247.06348: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867247.06350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.06403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867247.06407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867247.06442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867247.07995: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15247 1726867247.07998: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867247.08027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867247.08067: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp_29x7fa0 /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py <<< 15247 1726867247.08070: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py" <<< 15247 1726867247.08110: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp_29x7fa0" to remote "/root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py" <<< 15247 1726867247.09112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867247.09144: stderr chunk (state=3): >>><<< 15247 1726867247.09147: stdout chunk (state=3): >>><<< 15247 1726867247.09162: done transferring module to remote 15247 1726867247.09169: _low_level_execute_command(): starting 15247 1726867247.09172: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/ /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py && sleep 0' 15247 1726867247.09556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867247.09560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.09571: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.09634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867247.09637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867247.09669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867247.11427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867247.11449: stderr chunk (state=3): >>><<< 15247 1726867247.11452: stdout chunk (state=3): >>><<< 15247 1726867247.11465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867247.11468: _low_level_execute_command(): starting 15247 1726867247.11473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/AnsiballZ_setup.py && sleep 0' 15247 1726867247.11861: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867247.11864: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.11866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867247.11868: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867247.11870: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.11922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867247.11927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 15247 1726867247.11968: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867247.77127: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 485, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796917248, "block_size": 4096, "block_total": 65519099, "block_available": 63915263, "block_used": 1603836, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "47", "epoch": "1726867247", "epoch_int": "1726867247", "date": "2024-09-20", "time": "17:20:47", "iso8601_micro": "2024-09-20T21:20:47.720472Z", "iso8601": "2024-09-20T21:20:47Z", "iso8601_basic": "20240920T172047720472", "iso8601_basic_short": "20240920T172047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.56103515625, "5m": 0.38134765625, "15m": 0.18896484375}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "62:84:2b:2f:a5:23", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867247.79287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867247.79291: stdout chunk (state=3): >>><<< 15247 1726867247.79294: stderr chunk (state=3): >>><<< 15247 1726867247.79297: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", 
"ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 485, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796917248, "block_size": 4096, "block_total": 65519099, "block_available": 63915263, "block_used": 1603836, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", 
"ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "47", "epoch": "1726867247", "epoch_int": "1726867247", "date": "2024-09-20", "time": "17:20:47", "iso8601_micro": "2024-09-20T21:20:47.720472Z", "iso8601": "2024-09-20T21:20:47Z", "iso8601_basic": "20240920T172047720472", "iso8601_basic_short": "20240920T172047", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.56103515625, "5m": 0.38134765625, "15m": 0.18896484375}, "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], 
"executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "62:84:2b:2f:a5:23", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867247.79696: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867247.79733: _low_level_execute_command(): starting 15247 1726867247.79751: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867247.026168-16137-26750815206053/ > /dev/null 2>&1 && sleep 0' 15247 1726867247.80439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867247.80499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867247.80573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867247.80612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867247.80688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867247.82523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867247.82584: stderr chunk (state=3): >>><<< 15247 1726867247.82593: stdout chunk (state=3): >>><<< 15247 1726867247.82782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15247 1726867247.82786: handler run complete 15247 1726867247.82789: variable 'ansible_facts' from source: unknown 15247 1726867247.82892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.83267: variable 'ansible_facts' from source: unknown 15247 1726867247.83372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.83531: attempt loop complete, returning result 15247 1726867247.83540: _execute() done 15247 1726867247.83547: dumping result to json 15247 1726867247.83595: done dumping result, returning 15247 1726867247.83611: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-00000000024e] 15247 1726867247.83621: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000024e 15247 1726867247.84304: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000024e 15247 1726867247.84310: WORKER PROCESS EXITING ok: [managed_node2] 15247 1726867247.84762: no more pending results, returning what we have 15247 1726867247.84765: results queue empty 15247 1726867247.84766: checking for any_errors_fatal 15247 1726867247.84767: done checking for any_errors_fatal 15247 1726867247.84768: checking for max_fail_percentage 15247 1726867247.84770: done checking for max_fail_percentage 15247 1726867247.84771: checking to see if all hosts have failed and the running result is not ok 15247 1726867247.84772: done checking to see if all hosts have failed 15247 1726867247.84772: getting the remaining hosts for this loop 15247 1726867247.84773: done getting the remaining hosts for this loop 15247 1726867247.84779: getting the next task for host managed_node2 15247 1726867247.84784: done getting next task for host managed_node2 15247 1726867247.84785: ^ task is: TASK: meta (flush_handlers) 15247 1726867247.84787: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867247.84791: getting variables 15247 1726867247.84792: in VariableManager get_vars() 15247 1726867247.84816: Calling all_inventory to load vars for managed_node2 15247 1726867247.84818: Calling groups_inventory to load vars for managed_node2 15247 1726867247.84821: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867247.84831: Calling all_plugins_play to load vars for managed_node2 15247 1726867247.84834: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867247.84837: Calling groups_plugins_play to load vars for managed_node2 15247 1726867247.86189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.87882: done with get_vars() 15247 1726867247.87904: done getting variables 15247 1726867247.87953: in VariableManager get_vars() 15247 1726867247.87960: Calling all_inventory to load vars for managed_node2 15247 1726867247.87962: Calling groups_inventory to load vars for managed_node2 15247 1726867247.87963: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867247.87967: Calling all_plugins_play to load vars for managed_node2 15247 1726867247.87968: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867247.87970: Calling groups_plugins_play to load vars for managed_node2 15247 1726867247.88586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.89469: done with get_vars() 15247 1726867247.89498: done queuing things up, now waiting for results queue to drain 15247 1726867247.89500: results queue empty 15247 1726867247.89501: checking for any_errors_fatal 15247 1726867247.89504: done checking for 
any_errors_fatal 15247 1726867247.89505: checking for max_fail_percentage 15247 1726867247.89506: done checking for max_fail_percentage 15247 1726867247.89506: checking to see if all hosts have failed and the running result is not ok 15247 1726867247.89507: done checking to see if all hosts have failed 15247 1726867247.89511: getting the remaining hosts for this loop 15247 1726867247.89512: done getting the remaining hosts for this loop 15247 1726867247.89515: getting the next task for host managed_node2 15247 1726867247.89519: done getting next task for host managed_node2 15247 1726867247.89521: ^ task is: TASK: Include the task '{{ task }}' 15247 1726867247.89523: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867247.89525: getting variables 15247 1726867247.89526: in VariableManager get_vars() 15247 1726867247.89535: Calling all_inventory to load vars for managed_node2 15247 1726867247.89537: Calling groups_inventory to load vars for managed_node2 15247 1726867247.89539: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867247.89548: Calling all_plugins_play to load vars for managed_node2 15247 1726867247.89551: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867247.89554: Calling groups_plugins_play to load vars for managed_node2 15247 1726867247.90705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.92064: done with get_vars() 15247 1726867247.92080: done getting variables 15247 1726867247.92202: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 17:20:47 -0400 (0:00:00.936) 0:00:17.631 ****** 15247 1726867247.92226: entering _queue_task() for managed_node2/include_tasks 15247 1726867247.92474: worker is 1 (out of 1 available) 15247 1726867247.92487: exiting _queue_task() for managed_node2/include_tasks 15247 1726867247.92499: done queuing things up, now waiting for results queue to drain 15247 1726867247.92501: waiting for pending results... 15247 1726867247.92666: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_present.yml' 15247 1726867247.92740: in run() - task 0affcac9-a3a5-8ce3-1923-000000000031 15247 1726867247.92750: variable 'ansible_search_path' from source: unknown 15247 1726867247.92780: calling self._execute() 15247 1726867247.92851: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867247.92857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867247.92866: variable 'omit' from source: magic vars 15247 1726867247.93142: variable 'ansible_distribution_major_version' from source: facts 15247 1726867247.93150: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867247.93156: variable 'task' from source: play vars 15247 1726867247.93207: variable 'task' from source: play vars 15247 1726867247.93217: _execute() done 15247 1726867247.93221: dumping result to json 15247 1726867247.93223: done dumping result, returning 15247 1726867247.93229: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_present.yml' [0affcac9-a3a5-8ce3-1923-000000000031] 15247 1726867247.93235: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000031 15247 1726867247.93318: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000031 15247 1726867247.93320: WORKER PROCESS EXITING 15247 1726867247.93343: no more 
pending results, returning what we have 15247 1726867247.93348: in VariableManager get_vars() 15247 1726867247.93380: Calling all_inventory to load vars for managed_node2 15247 1726867247.93383: Calling groups_inventory to load vars for managed_node2 15247 1726867247.93386: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867247.93399: Calling all_plugins_play to load vars for managed_node2 15247 1726867247.93401: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867247.93404: Calling groups_plugins_play to load vars for managed_node2 15247 1726867247.94480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.95471: done with get_vars() 15247 1726867247.95486: variable 'ansible_search_path' from source: unknown 15247 1726867247.95496: we have included files to process 15247 1726867247.95497: generating all_blocks data 15247 1726867247.95498: done generating all_blocks data 15247 1726867247.95498: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15247 1726867247.95499: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15247 1726867247.95500: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 15247 1726867247.95623: in VariableManager get_vars() 15247 1726867247.95634: done with get_vars() 15247 1726867247.95801: done processing included file 15247 1726867247.95802: iterating over new_blocks loaded from include file 15247 1726867247.95803: in VariableManager get_vars() 15247 1726867247.95811: done with get_vars() 15247 1726867247.95812: filtering new block on tags 15247 1726867247.95825: done filtering new block on tags 15247 1726867247.95826: done iterating over 
new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node2 15247 1726867247.95830: extending task lists for all hosts with included blocks 15247 1726867247.95847: done extending task lists 15247 1726867247.95848: done processing included files 15247 1726867247.95848: results queue empty 15247 1726867247.95849: checking for any_errors_fatal 15247 1726867247.95850: done checking for any_errors_fatal 15247 1726867247.95850: checking for max_fail_percentage 15247 1726867247.95851: done checking for max_fail_percentage 15247 1726867247.95851: checking to see if all hosts have failed and the running result is not ok 15247 1726867247.95852: done checking to see if all hosts have failed 15247 1726867247.95852: getting the remaining hosts for this loop 15247 1726867247.95853: done getting the remaining hosts for this loop 15247 1726867247.95854: getting the next task for host managed_node2 15247 1726867247.95857: done getting next task for host managed_node2 15247 1726867247.95858: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15247 1726867247.95860: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867247.95861: getting variables 15247 1726867247.95862: in VariableManager get_vars() 15247 1726867247.95866: Calling all_inventory to load vars for managed_node2 15247 1726867247.95868: Calling groups_inventory to load vars for managed_node2 15247 1726867247.95869: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867247.95873: Calling all_plugins_play to load vars for managed_node2 15247 1726867247.95875: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867247.95879: Calling groups_plugins_play to load vars for managed_node2 15247 1726867247.96545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867247.97716: done with get_vars() 15247 1726867247.97737: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:20:47 -0400 (0:00:00.055) 0:00:17.687 ****** 15247 1726867247.97807: entering _queue_task() for managed_node2/include_tasks 15247 1726867247.98065: worker is 1 (out of 1 available) 15247 1726867247.98078: exiting _queue_task() for managed_node2/include_tasks 15247 1726867247.98090: done queuing things up, now waiting for results queue to drain 15247 1726867247.98091: waiting for pending results... 
15247 1726867247.98252: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 15247 1726867247.98376: in run() - task 0affcac9-a3a5-8ce3-1923-00000000025f 15247 1726867247.98422: variable 'ansible_search_path' from source: unknown 15247 1726867247.98430: variable 'ansible_search_path' from source: unknown 15247 1726867247.98467: calling self._execute() 15247 1726867247.98564: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867247.98567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867247.98570: variable 'omit' from source: magic vars 15247 1726867247.98907: variable 'ansible_distribution_major_version' from source: facts 15247 1726867247.98930: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867247.98943: _execute() done 15247 1726867247.98958: dumping result to json 15247 1726867247.98968: done dumping result, returning 15247 1726867247.98981: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-8ce3-1923-00000000025f] 15247 1726867247.98992: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000025f 15247 1726867247.99095: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000025f 15247 1726867247.99098: WORKER PROCESS EXITING 15247 1726867247.99154: no more pending results, returning what we have 15247 1726867247.99165: in VariableManager get_vars() 15247 1726867247.99212: Calling all_inventory to load vars for managed_node2 15247 1726867247.99215: Calling groups_inventory to load vars for managed_node2 15247 1726867247.99218: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867247.99237: Calling all_plugins_play to load vars for managed_node2 15247 1726867247.99243: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867247.99246: Calling groups_plugins_play to load vars for managed_node2 15247 
1726867248.00400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867248.01696: done with get_vars() 15247 1726867248.01709: variable 'ansible_search_path' from source: unknown 15247 1726867248.01710: variable 'ansible_search_path' from source: unknown 15247 1726867248.01716: variable 'task' from source: play vars 15247 1726867248.01790: variable 'task' from source: play vars 15247 1726867248.01814: we have included files to process 15247 1726867248.01814: generating all_blocks data 15247 1726867248.01816: done generating all_blocks data 15247 1726867248.01816: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15247 1726867248.01817: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15247 1726867248.01818: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15247 1726867248.02651: done processing included file 15247 1726867248.02653: iterating over new_blocks loaded from include file 15247 1726867248.02654: in VariableManager get_vars() 15247 1726867248.02662: done with get_vars() 15247 1726867248.02663: filtering new block on tags 15247 1726867248.02676: done filtering new block on tags 15247 1726867248.02679: in VariableManager get_vars() 15247 1726867248.02686: done with get_vars() 15247 1726867248.02687: filtering new block on tags 15247 1726867248.02699: done filtering new block on tags 15247 1726867248.02700: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 15247 1726867248.02703: extending task lists for all hosts with included blocks 15247 1726867248.02798: done extending 
task lists 15247 1726867248.02799: done processing included files 15247 1726867248.02799: results queue empty 15247 1726867248.02800: checking for any_errors_fatal 15247 1726867248.02802: done checking for any_errors_fatal 15247 1726867248.02803: checking for max_fail_percentage 15247 1726867248.02803: done checking for max_fail_percentage 15247 1726867248.02804: checking to see if all hosts have failed and the running result is not ok 15247 1726867248.02804: done checking to see if all hosts have failed 15247 1726867248.02805: getting the remaining hosts for this loop 15247 1726867248.02807: done getting the remaining hosts for this loop 15247 1726867248.02809: getting the next task for host managed_node2 15247 1726867248.02812: done getting next task for host managed_node2 15247 1726867248.02813: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15247 1726867248.02815: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867248.02816: getting variables 15247 1726867248.02817: in VariableManager get_vars() 15247 1726867248.06908: Calling all_inventory to load vars for managed_node2 15247 1726867248.06911: Calling groups_inventory to load vars for managed_node2 15247 1726867248.06914: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867248.06920: Calling all_plugins_play to load vars for managed_node2 15247 1726867248.06922: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867248.06925: Calling groups_plugins_play to load vars for managed_node2 15247 1726867248.08114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867248.09918: done with get_vars() 15247 1726867248.09940: done getting variables 15247 1726867248.09989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:20:48 -0400 (0:00:00.122) 0:00:17.809 ****** 15247 1726867248.10020: entering _queue_task() for managed_node2/set_fact 15247 1726867248.10809: worker is 1 (out of 1 available) 15247 1726867248.10822: exiting _queue_task() for managed_node2/set_fact 15247 1726867248.10839: done queuing things up, now waiting for results queue to drain 15247 1726867248.10840: waiting for pending results... 
15247 1726867248.11133: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 15247 1726867248.11282: in run() - task 0affcac9-a3a5-8ce3-1923-00000000026c 15247 1726867248.11298: variable 'ansible_search_path' from source: unknown 15247 1726867248.11302: variable 'ansible_search_path' from source: unknown 15247 1726867248.11379: calling self._execute() 15247 1726867248.11495: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.11498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.11517: variable 'omit' from source: magic vars 15247 1726867248.11954: variable 'ansible_distribution_major_version' from source: facts 15247 1726867248.11958: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867248.11960: variable 'omit' from source: magic vars 15247 1726867248.12005: variable 'omit' from source: magic vars 15247 1726867248.12182: variable 'omit' from source: magic vars 15247 1726867248.12185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867248.12189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867248.12192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867248.12195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867248.12198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867248.12239: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867248.12248: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.12257: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867248.12367: Set connection var ansible_shell_executable to /bin/sh 15247 1726867248.12376: Set connection var ansible_connection to ssh 15247 1726867248.12387: Set connection var ansible_shell_type to sh 15247 1726867248.12396: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867248.12410: Set connection var ansible_timeout to 10 15247 1726867248.12419: Set connection var ansible_pipelining to False 15247 1726867248.12446: variable 'ansible_shell_executable' from source: unknown 15247 1726867248.12455: variable 'ansible_connection' from source: unknown 15247 1726867248.12464: variable 'ansible_module_compression' from source: unknown 15247 1726867248.12471: variable 'ansible_shell_type' from source: unknown 15247 1726867248.12479: variable 'ansible_shell_executable' from source: unknown 15247 1726867248.12487: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.12495: variable 'ansible_pipelining' from source: unknown 15247 1726867248.12503: variable 'ansible_timeout' from source: unknown 15247 1726867248.12514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.12665: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867248.12684: variable 'omit' from source: magic vars 15247 1726867248.12701: starting attempt loop 15247 1726867248.12712: running the handler 15247 1726867248.12731: handler run complete 15247 1726867248.12744: attempt loop complete, returning result 15247 1726867248.12751: _execute() done 15247 1726867248.12760: dumping result to json 15247 1726867248.12883: done dumping result, returning 15247 1726867248.12886: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-8ce3-1923-00000000026c] 15247 1726867248.12888: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026c ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15247 1726867248.13015: no more pending results, returning what we have 15247 1726867248.13020: results queue empty 15247 1726867248.13021: checking for any_errors_fatal 15247 1726867248.13023: done checking for any_errors_fatal 15247 1726867248.13024: checking for max_fail_percentage 15247 1726867248.13026: done checking for max_fail_percentage 15247 1726867248.13027: checking to see if all hosts have failed and the running result is not ok 15247 1726867248.13028: done checking to see if all hosts have failed 15247 1726867248.13028: getting the remaining hosts for this loop 15247 1726867248.13030: done getting the remaining hosts for this loop 15247 1726867248.13045: getting the next task for host managed_node2 15247 1726867248.13055: done getting next task for host managed_node2 15247 1726867248.13058: ^ task is: TASK: Stat profile file 15247 1726867248.13063: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867248.13067: getting variables 15247 1726867248.13069: in VariableManager get_vars() 15247 1726867248.13101: Calling all_inventory to load vars for managed_node2 15247 1726867248.13104: Calling groups_inventory to load vars for managed_node2 15247 1726867248.13110: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867248.13122: Calling all_plugins_play to load vars for managed_node2 15247 1726867248.13124: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867248.13127: Calling groups_plugins_play to load vars for managed_node2 15247 1726867248.13713: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026c 15247 1726867248.13717: WORKER PROCESS EXITING 15247 1726867248.14797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867248.16368: done with get_vars() 15247 1726867248.16388: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:20:48 -0400 (0:00:00.064) 0:00:17.874 ****** 15247 1726867248.16457: entering _queue_task() for managed_node2/stat 15247 1726867248.16718: worker is 1 (out of 1 available) 15247 1726867248.16731: exiting _queue_task() for managed_node2/stat 15247 1726867248.16743: done queuing things up, now waiting for results queue to drain 15247 1726867248.16745: waiting for pending results... 
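The `set_fact` result shown above (`ok: [managed_node2]` with three `lsr_net_profile_*` facts all `false`) corresponds to a task at `get_profile_stat.yml:3` of roughly this shape. The fact names and values are taken from the result JSON in the log; the surrounding layout is an assumption:

```yaml
# Sketch of the task "Initialize NM profile exist and ansible_managed
# comment flag", reconstructed from the ansible_facts in the log above:
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```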
15247 1726867248.16921: running TaskExecutor() for managed_node2/TASK: Stat profile file 15247 1726867248.17015: in run() - task 0affcac9-a3a5-8ce3-1923-00000000026d 15247 1726867248.17025: variable 'ansible_search_path' from source: unknown 15247 1726867248.17029: variable 'ansible_search_path' from source: unknown 15247 1726867248.17057: calling self._execute() 15247 1726867248.17165: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.17169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.17172: variable 'omit' from source: magic vars 15247 1726867248.17624: variable 'ansible_distribution_major_version' from source: facts 15247 1726867248.17628: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867248.17631: variable 'omit' from source: magic vars 15247 1726867248.17667: variable 'omit' from source: magic vars 15247 1726867248.17753: variable 'profile' from source: play vars 15247 1726867248.17756: variable 'interface' from source: set_fact 15247 1726867248.17876: variable 'interface' from source: set_fact 15247 1726867248.17883: variable 'omit' from source: magic vars 15247 1726867248.17914: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867248.17942: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867248.18014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867248.18017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867248.18020: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867248.18033: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 
1726867248.18037: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.18046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.18151: Set connection var ansible_shell_executable to /bin/sh 15247 1726867248.18155: Set connection var ansible_connection to ssh 15247 1726867248.18157: Set connection var ansible_shell_type to sh 15247 1726867248.18159: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867248.18166: Set connection var ansible_timeout to 10 15247 1726867248.18171: Set connection var ansible_pipelining to False 15247 1726867248.18240: variable 'ansible_shell_executable' from source: unknown 15247 1726867248.18243: variable 'ansible_connection' from source: unknown 15247 1726867248.18245: variable 'ansible_module_compression' from source: unknown 15247 1726867248.18247: variable 'ansible_shell_type' from source: unknown 15247 1726867248.18248: variable 'ansible_shell_executable' from source: unknown 15247 1726867248.18250: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.18251: variable 'ansible_pipelining' from source: unknown 15247 1726867248.18253: variable 'ansible_timeout' from source: unknown 15247 1726867248.18279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.18514: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867248.18552: variable 'omit' from source: magic vars 15247 1726867248.18555: starting attempt loop 15247 1726867248.18586: running the handler 15247 1726867248.18594: _low_level_execute_command(): starting 15247 1726867248.18607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867248.19744: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867248.19779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.19857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.19880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.19988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.20034: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.20154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.21807: stdout chunk (state=3): >>>/root <<< 15247 1726867248.21949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.21952: stdout chunk (state=3): >>><<< 15247 1726867248.21954: stderr chunk (state=3): >>><<< 15247 1726867248.21973: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.21994: _low_level_execute_command(): starting 15247 1726867248.22065: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633 `" && echo ansible-tmp-1726867248.219821-16171-256647123796633="` echo /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633 `" ) && sleep 0' 15247 1726867248.22875: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.22926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867248.22942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.22969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.23067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.24956: stdout chunk (state=3): >>>ansible-tmp-1726867248.219821-16171-256647123796633=/root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633 <<< 15247 1726867248.25087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.25121: stderr chunk (state=3): >>><<< 15247 1726867248.25123: stdout chunk (state=3): >>><<< 15247 1726867248.25148: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867248.219821-16171-256647123796633=/root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.25211: variable 'ansible_module_compression' from source: unknown 15247 1726867248.25283: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15247 1726867248.25333: variable 'ansible_facts' from source: unknown 15247 1726867248.25441: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py 15247 1726867248.25575: Sending initial data 15247 1726867248.25580: Sent initial data (152 bytes) 15247 1726867248.26135: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.26150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867248.26155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.26158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.26160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.26215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.26330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.26415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.28015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867248.28057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867248.28089: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmppdgilpzc /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py <<< 15247 1726867248.28093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py" <<< 15247 1726867248.28127: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmppdgilpzc" to remote "/root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py" <<< 15247 1726867248.28130: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py" <<< 15247 1726867248.28652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.28700: stderr chunk (state=3): >>><<< 15247 1726867248.28704: stdout chunk (state=3): >>><<< 15247 1726867248.28740: done transferring module to remote 15247 1726867248.28749: _low_level_execute_command(): starting 15247 1726867248.28754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/ /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py && sleep 0' 15247 1726867248.29360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.29396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.31161: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.31333: stderr chunk (state=3): >>><<< 15247 1726867248.31337: stdout chunk (state=3): >>><<< 15247 1726867248.31345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.31348: _low_level_execute_command(): starting 15247 1726867248.31350: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/AnsiballZ_stat.py && sleep 0' 15247 1726867248.31871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867248.31889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.31905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.31927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.32000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.32049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867248.32069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.32094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.32165: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15247 1726867248.47608: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15247 1726867248.48969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867248.48973: stdout chunk (state=3): >>><<< 15247 1726867248.48975: stderr chunk (state=3): >>><<< 15247 1726867248.48981: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867248.48984: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867248.48987: _low_level_execute_command(): starting 15247 1726867248.48989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867248.219821-16171-256647123796633/ > /dev/null 2>&1 && sleep 0' 15247 1726867248.50123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.50156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.50351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.50512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.52291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.52309: stdout chunk (state=3): >>><<< 15247 1726867248.52324: stderr chunk (state=3): >>><<< 15247 1726867248.52355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.52372: handler run complete 15247 1726867248.52396: attempt loop complete, returning result 15247 1726867248.52399: _execute() done 15247 1726867248.52401: dumping result to json 15247 1726867248.52404: done dumping result, returning 15247 1726867248.52415: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcac9-a3a5-8ce3-1923-00000000026d] 15247 1726867248.52420: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026d 15247 1726867248.52522: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026d 15247 1726867248.52525: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15247 1726867248.52580: no more pending results, returning what we have 15247 1726867248.52584: results queue empty 15247 1726867248.52585: checking for any_errors_fatal 15247 1726867248.52595: done checking for any_errors_fatal 15247 1726867248.52596: checking for max_fail_percentage 15247 1726867248.52598: done checking for max_fail_percentage 15247 1726867248.52598: checking to see if all hosts have failed and the running result is not ok 15247 1726867248.52599: done checking to see if all hosts have failed 15247 1726867248.52600: getting the remaining hosts for this loop 15247 1726867248.52601: done getting the remaining hosts for this loop 15247 1726867248.52605: getting the next task for host managed_node2 15247 1726867248.52612: done getting next task for host managed_node2 15247 1726867248.52614: ^ task is: TASK: Set NM profile exist flag based on the profile files 15247 1726867248.52618: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867248.52621: getting variables 15247 1726867248.52623: in VariableManager get_vars() 15247 1726867248.52652: Calling all_inventory to load vars for managed_node2 15247 1726867248.52654: Calling groups_inventory to load vars for managed_node2 15247 1726867248.52658: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867248.52668: Calling all_plugins_play to load vars for managed_node2 15247 1726867248.52671: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867248.52673: Calling groups_plugins_play to load vars for managed_node2 15247 1726867248.53568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867248.55140: done with get_vars() 15247 1726867248.55163: done getting variables 15247 1726867248.55234: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:20:48 -0400 (0:00:00.388) 0:00:18.262 ****** 15247 1726867248.55271: entering _queue_task() for managed_node2/set_fact 15247 1726867248.55595: worker is 1 (out of 1 available) 15247 1726867248.55610: exiting _queue_task() for managed_node2/set_fact 15247 1726867248.55623: done queuing things up, now waiting for results queue to drain 15247 1726867248.55625: waiting for pending results... 15247 1726867248.56008: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 15247 1726867248.56075: in run() - task 0affcac9-a3a5-8ce3-1923-00000000026e 15247 1726867248.56298: variable 'ansible_search_path' from source: unknown 15247 1726867248.56302: variable 'ansible_search_path' from source: unknown 15247 1726867248.56305: calling self._execute() 15247 1726867248.56307: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.56310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.56312: variable 'omit' from source: magic vars 15247 1726867248.56668: variable 'ansible_distribution_major_version' from source: facts 15247 1726867248.56688: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867248.56812: variable 'profile_stat' from source: set_fact 15247 1726867248.56830: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867248.56842: when evaluation is False, skipping this task 15247 1726867248.56849: _execute() done 15247 1726867248.56856: dumping result to json 15247 1726867248.56863: done dumping result, returning 15247 1726867248.56874: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-8ce3-1923-00000000026e] 15247 1726867248.56887: sending task result for task 
0affcac9-a3a5-8ce3-1923-00000000026e skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867248.57030: no more pending results, returning what we have 15247 1726867248.57034: results queue empty 15247 1726867248.57035: checking for any_errors_fatal 15247 1726867248.57045: done checking for any_errors_fatal 15247 1726867248.57046: checking for max_fail_percentage 15247 1726867248.57048: done checking for max_fail_percentage 15247 1726867248.57048: checking to see if all hosts have failed and the running result is not ok 15247 1726867248.57049: done checking to see if all hosts have failed 15247 1726867248.57050: getting the remaining hosts for this loop 15247 1726867248.57051: done getting the remaining hosts for this loop 15247 1726867248.57055: getting the next task for host managed_node2 15247 1726867248.57062: done getting next task for host managed_node2 15247 1726867248.57065: ^ task is: TASK: Get NM profile info 15247 1726867248.57069: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867248.57072: getting variables 15247 1726867248.57074: in VariableManager get_vars() 15247 1726867248.57105: Calling all_inventory to load vars for managed_node2 15247 1726867248.57108: Calling groups_inventory to load vars for managed_node2 15247 1726867248.57112: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867248.57126: Calling all_plugins_play to load vars for managed_node2 15247 1726867248.57128: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867248.57131: Calling groups_plugins_play to load vars for managed_node2 15247 1726867248.57890: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026e 15247 1726867248.57894: WORKER PROCESS EXITING 15247 1726867248.58848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867248.60975: done with get_vars() 15247 1726867248.60997: done getting variables 15247 1726867248.61119: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:20:48 -0400 (0:00:00.059) 0:00:18.321 ****** 15247 1726867248.61180: entering _queue_task() for managed_node2/shell 15247 1726867248.61182: Creating lock for shell 15247 1726867248.61490: worker is 1 (out of 1 available) 15247 1726867248.61503: exiting _queue_task() for managed_node2/shell 15247 1726867248.61515: done queuing things up, now waiting for results queue to drain 15247 1726867248.61517: waiting for pending results... 
15247 1726867248.61771: running TaskExecutor() for managed_node2/TASK: Get NM profile info 15247 1726867248.61904: in run() - task 0affcac9-a3a5-8ce3-1923-00000000026f 15247 1726867248.61929: variable 'ansible_search_path' from source: unknown 15247 1726867248.61937: variable 'ansible_search_path' from source: unknown 15247 1726867248.61981: calling self._execute() 15247 1726867248.62101: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.62105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.62110: variable 'omit' from source: magic vars 15247 1726867248.62503: variable 'ansible_distribution_major_version' from source: facts 15247 1726867248.62534: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867248.62538: variable 'omit' from source: magic vars 15247 1726867248.62642: variable 'omit' from source: magic vars 15247 1726867248.62931: variable 'profile' from source: play vars 15247 1726867248.62934: variable 'interface' from source: set_fact 15247 1726867248.62943: variable 'interface' from source: set_fact 15247 1726867248.62966: variable 'omit' from source: magic vars 15247 1726867248.63015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867248.63060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867248.63088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867248.63118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867248.63135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867248.63175: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 
1726867248.63186: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.63193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.63318: Set connection var ansible_shell_executable to /bin/sh 15247 1726867248.63326: Set connection var ansible_connection to ssh 15247 1726867248.63332: Set connection var ansible_shell_type to sh 15247 1726867248.63342: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867248.63364: Set connection var ansible_timeout to 10 15247 1726867248.63384: Set connection var ansible_pipelining to False 15247 1726867248.63451: variable 'ansible_shell_executable' from source: unknown 15247 1726867248.63460: variable 'ansible_connection' from source: unknown 15247 1726867248.63468: variable 'ansible_module_compression' from source: unknown 15247 1726867248.63475: variable 'ansible_shell_type' from source: unknown 15247 1726867248.63483: variable 'ansible_shell_executable' from source: unknown 15247 1726867248.63507: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867248.63542: variable 'ansible_pipelining' from source: unknown 15247 1726867248.63547: variable 'ansible_timeout' from source: unknown 15247 1726867248.63550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867248.63763: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867248.63882: variable 'omit' from source: magic vars 15247 1726867248.63885: starting attempt loop 15247 1726867248.63887: running the handler 15247 1726867248.63890: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867248.63893: _low_level_execute_command(): starting 15247 1726867248.63895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867248.64659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867248.64694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.64734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.64816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.64822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.64941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.64982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.65099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.66731: stdout chunk (state=3): >>>/root <<< 15247 1726867248.66923: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.66927: stdout chunk (state=3): >>><<< 15247 1726867248.66929: stderr chunk (state=3): >>><<< 15247 1726867248.67097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.67100: _low_level_execute_command(): starting 15247 1726867248.67103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294 `" && echo ansible-tmp-1726867248.6696837-16188-123507769102294="` echo /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294 `" ) && sleep 0' 15247 1726867248.68026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config <<< 15247 1726867248.68034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.68158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.68163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867248.68169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.68213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.70106: stdout chunk (state=3): >>>ansible-tmp-1726867248.6696837-16188-123507769102294=/root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294 <<< 15247 1726867248.70260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.70266: stdout chunk (state=3): >>><<< 15247 1726867248.70285: stderr chunk (state=3): >>><<< 15247 1726867248.70483: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867248.6696837-16188-123507769102294=/root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.70486: variable 'ansible_module_compression' from source: unknown 15247 1726867248.70488: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15247 1726867248.70490: variable 'ansible_facts' from source: unknown 15247 1726867248.70529: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py 15247 1726867248.70735: Sending initial data 15247 1726867248.70738: Sent initial data (156 bytes) 15247 1726867248.71265: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867248.71294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
15247 1726867248.71310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867248.71371: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.71452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867248.71495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.71524: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.71601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.73157: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15247 1726867248.73175: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server 
extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867248.73231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867248.73318: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp73fcwaa8 /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py <<< 15247 1726867248.73351: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py" <<< 15247 1726867248.73438: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp73fcwaa8" to remote "/root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py" <<< 15247 1726867248.74801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.74804: stderr chunk (state=3): >>><<< 15247 1726867248.74806: stdout chunk (state=3): >>><<< 15247 1726867248.74808: done transferring module to remote 15247 1726867248.74811: _low_level_execute_command(): starting 15247 1726867248.74813: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/ /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py && sleep 0' 15247 1726867248.75399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.75462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.75533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867248.75579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.75663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.77457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867248.77508: stderr chunk (state=3): >>><<< 15247 1726867248.77522: stdout chunk (state=3): >>><<< 15247 1726867248.77545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867248.77639: _low_level_execute_command(): starting 15247 1726867248.77643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/AnsiballZ_command.py && sleep 0' 15247 1726867248.78234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867248.78257: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.78274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.78313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.78361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867248.78431: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867248.78458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.78504: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.78693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867248.95851: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 17:20:48.939586", "end": "2024-09-20 17:20:48.957351", "delta": "0:00:00.017765", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15247 1726867248.97566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867248.97595: stdout chunk (state=3): >>><<< 15247 1726867248.97599: stderr chunk (state=3): >>><<< 15247 1726867248.97739: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 17:20:48.939586", "end": "2024-09-20 17:20:48.957351", "delta": "0:00:00.017765", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867248.97743: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867248.97745: _low_level_execute_command(): starting 15247 1726867248.97748: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867248.6696837-16188-123507769102294/ > /dev/null 2>&1 && sleep 0' 15247 1726867248.98287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867248.98304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867248.98322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.98340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.98362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867248.98394: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867248.98410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867248.98486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867248.98600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867248.98656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867249.00593: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867249.00597: stdout chunk (state=3): >>><<< 15247 1726867249.00603: stderr chunk (state=3): >>><<< 15247 1726867249.00622: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867249.00629: handler run complete 15247 1726867249.00655: Evaluated conditional (False): False 15247 1726867249.00665: attempt loop complete, returning result 15247 1726867249.00667: _execute() done 15247 1726867249.00670: dumping result to json 15247 1726867249.00675: done dumping result, returning 15247 1726867249.00687: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcac9-a3a5-8ce3-1923-00000000026f] 15247 1726867249.00692: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026f ok: [managed_node2] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.017765", "end": "2024-09-20 17:20:48.957351", "rc": 0, "start": "2024-09-20 17:20:48.939586" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 15247 1726867249.00946: no more pending results, returning what we have 15247 1726867249.00949: results queue empty 15247 1726867249.00950: checking for any_errors_fatal 15247 1726867249.00956: done checking for any_errors_fatal 15247 1726867249.00957: checking for max_fail_percentage 15247 1726867249.00959: done checking for max_fail_percentage 15247 1726867249.00960: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.00960: done checking to see if all hosts have failed 15247 1726867249.00961: getting the remaining hosts for this loop 15247 1726867249.00962: done getting the remaining hosts for this loop 15247 1726867249.00965: getting the next task for host managed_node2 15247 1726867249.00970: done getting next task for host managed_node2 15247 
1726867249.00973: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15247 1726867249.00976: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867249.00982: getting variables 15247 1726867249.00983: in VariableManager get_vars() 15247 1726867249.01008: Calling all_inventory to load vars for managed_node2 15247 1726867249.01010: Calling groups_inventory to load vars for managed_node2 15247 1726867249.01013: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.01023: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.01025: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.01027: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.01590: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000026f 15247 1726867249.01594: WORKER PROCESS EXITING 15247 1726867249.02429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.03963: done with get_vars() 15247 1726867249.03987: done getting variables 15247 1726867249.04045: Loading ActionModule 
'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:20:49 -0400 (0:00:00.428) 0:00:18.750 ****** 15247 1726867249.04076: entering _queue_task() for managed_node2/set_fact 15247 1726867249.04365: worker is 1 (out of 1 available) 15247 1726867249.04376: exiting _queue_task() for managed_node2/set_fact 15247 1726867249.04590: done queuing things up, now waiting for results queue to drain 15247 1726867249.04592: waiting for pending results... 15247 1726867249.04697: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15247 1726867249.04782: in run() - task 0affcac9-a3a5-8ce3-1923-000000000270 15247 1726867249.04803: variable 'ansible_search_path' from source: unknown 15247 1726867249.04813: variable 'ansible_search_path' from source: unknown 15247 1726867249.04856: calling self._execute() 15247 1726867249.04983: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.04987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.04989: variable 'omit' from source: magic vars 15247 1726867249.05368: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.05470: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.05515: variable 'nm_profile_exists' from source: set_fact 15247 1726867249.05535: Evaluated conditional (nm_profile_exists.rc == 0): True 15247 1726867249.05546: variable 'omit' 
from source: magic vars 15247 1726867249.05609: variable 'omit' from source: magic vars 15247 1726867249.05648: variable 'omit' from source: magic vars 15247 1726867249.05694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867249.05733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867249.05757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867249.05780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.05800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.05833: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867249.05842: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.05850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.05958: Set connection var ansible_shell_executable to /bin/sh 15247 1726867249.05967: Set connection var ansible_connection to ssh 15247 1726867249.05973: Set connection var ansible_shell_type to sh 15247 1726867249.06007: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867249.06010: Set connection var ansible_timeout to 10 15247 1726867249.06012: Set connection var ansible_pipelining to False 15247 1726867249.06035: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.06043: variable 'ansible_connection' from source: unknown 15247 1726867249.06115: variable 'ansible_module_compression' from source: unknown 15247 1726867249.06118: variable 'ansible_shell_type' from source: unknown 15247 1726867249.06120: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.06122: variable 
'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.06124: variable 'ansible_pipelining' from source: unknown 15247 1726867249.06126: variable 'ansible_timeout' from source: unknown 15247 1726867249.06129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.06254: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867249.06270: variable 'omit' from source: magic vars 15247 1726867249.06282: starting attempt loop 15247 1726867249.06289: running the handler 15247 1726867249.06306: handler run complete 15247 1726867249.06320: attempt loop complete, returning result 15247 1726867249.06332: _execute() done 15247 1726867249.06340: dumping result to json 15247 1726867249.06345: done dumping result, returning 15247 1726867249.06355: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-8ce3-1923-000000000270] 15247 1726867249.06363: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000270 15247 1726867249.06511: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000270 15247 1726867249.06514: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 15247 1726867249.06604: no more pending results, returning what we have 15247 1726867249.06607: results queue empty 15247 1726867249.06608: checking for any_errors_fatal 15247 1726867249.06617: done checking for any_errors_fatal 15247 1726867249.06618: checking for max_fail_percentage 15247 1726867249.06620: done checking for max_fail_percentage 
15247 1726867249.06621: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.06623: done checking to see if all hosts have failed 15247 1726867249.06623: getting the remaining hosts for this loop 15247 1726867249.06625: done getting the remaining hosts for this loop 15247 1726867249.06630: getting the next task for host managed_node2 15247 1726867249.06640: done getting next task for host managed_node2 15247 1726867249.06642: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15247 1726867249.06646: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.06651: getting variables 15247 1726867249.06653: in VariableManager get_vars() 15247 1726867249.06682: Calling all_inventory to load vars for managed_node2 15247 1726867249.06685: Calling groups_inventory to load vars for managed_node2 15247 1726867249.06688: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.06698: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.06701: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.06703: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.08247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.09829: done with get_vars() 15247 1726867249.09849: done getting variables 15247 1726867249.09906: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.10024: variable 'profile' from source: play vars 15247 1726867249.10028: variable 'interface' from source: set_fact 15247 1726867249.10089: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:20:49 -0400 (0:00:00.060) 0:00:18.810 ****** 15247 1726867249.10124: entering _queue_task() for managed_node2/command 15247 1726867249.10504: worker is 1 (out of 1 available) 15247 1726867249.10516: exiting _queue_task() for managed_node2/command 15247 1726867249.10527: done queuing things up, now waiting for results queue to drain 15247 1726867249.10528: waiting for pending results... 
15247 1726867249.10771: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15247 1726867249.11083: in run() - task 0affcac9-a3a5-8ce3-1923-000000000272 15247 1726867249.11086: variable 'ansible_search_path' from source: unknown 15247 1726867249.11089: variable 'ansible_search_path' from source: unknown 15247 1726867249.11092: calling self._execute() 15247 1726867249.11094: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.11097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.11100: variable 'omit' from source: magic vars 15247 1726867249.11652: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.11788: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.12084: variable 'profile_stat' from source: set_fact 15247 1726867249.12088: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867249.12093: when evaluation is False, skipping this task 15247 1726867249.12098: _execute() done 15247 1726867249.12101: dumping result to json 15247 1726867249.12103: done dumping result, returning 15247 1726867249.12109: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000272] 15247 1726867249.12112: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000272 skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867249.12390: no more pending results, returning what we have 15247 1726867249.12397: results queue empty 15247 1726867249.12398: checking for any_errors_fatal 15247 1726867249.12414: done checking for any_errors_fatal 15247 1726867249.12415: checking for max_fail_percentage 15247 1726867249.12417: done checking for max_fail_percentage 15247 1726867249.12419: checking to see if 
all hosts have failed and the running result is not ok 15247 1726867249.12421: done checking to see if all hosts have failed 15247 1726867249.12422: getting the remaining hosts for this loop 15247 1726867249.12423: done getting the remaining hosts for this loop 15247 1726867249.12430: getting the next task for host managed_node2 15247 1726867249.12442: done getting next task for host managed_node2 15247 1726867249.12445: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15247 1726867249.12452: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.12458: getting variables 15247 1726867249.12461: in VariableManager get_vars() 15247 1726867249.12707: Calling all_inventory to load vars for managed_node2 15247 1726867249.12710: Calling groups_inventory to load vars for managed_node2 15247 1726867249.12715: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.12793: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.12797: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.12801: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.13485: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000272 15247 1726867249.13488: WORKER PROCESS EXITING 15247 1726867249.15969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.19381: done with get_vars() 15247 1726867249.19408: done getting variables 15247 1726867249.19470: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.19591: variable 'profile' from source: play vars 15247 1726867249.19594: variable 'interface' from source: set_fact 15247 1726867249.19654: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:20:49 -0400 (0:00:00.095) 0:00:18.906 ****** 15247 1726867249.19688: entering _queue_task() for managed_node2/set_fact 15247 1726867249.20197: worker is 1 (out of 1 available) 15247 1726867249.20210: exiting _queue_task() for managed_node2/set_fact 15247 
1726867249.20221: done queuing things up, now waiting for results queue to drain 15247 1726867249.20222: waiting for pending results... 15247 1726867249.20304: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15247 1726867249.20428: in run() - task 0affcac9-a3a5-8ce3-1923-000000000273 15247 1726867249.20453: variable 'ansible_search_path' from source: unknown 15247 1726867249.20461: variable 'ansible_search_path' from source: unknown 15247 1726867249.20503: calling self._execute() 15247 1726867249.20605: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.20623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.20641: variable 'omit' from source: magic vars 15247 1726867249.21019: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.21036: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.21244: variable 'profile_stat' from source: set_fact 15247 1726867249.21297: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867249.21309: when evaluation is False, skipping this task 15247 1726867249.21326: _execute() done 15247 1726867249.21586: dumping result to json 15247 1726867249.21590: done dumping result, returning 15247 1726867249.21592: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000273] 15247 1726867249.21595: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000273 15247 1726867249.21664: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000273 15247 1726867249.21667: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867249.21737: no more pending results, returning what we have 15247 1726867249.21741: results 
queue empty 15247 1726867249.21742: checking for any_errors_fatal 15247 1726867249.21750: done checking for any_errors_fatal 15247 1726867249.21751: checking for max_fail_percentage 15247 1726867249.21753: done checking for max_fail_percentage 15247 1726867249.21754: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.21756: done checking to see if all hosts have failed 15247 1726867249.21756: getting the remaining hosts for this loop 15247 1726867249.21758: done getting the remaining hosts for this loop 15247 1726867249.21762: getting the next task for host managed_node2 15247 1726867249.21769: done getting next task for host managed_node2 15247 1726867249.21772: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15247 1726867249.21782: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.21787: getting variables 15247 1726867249.21789: in VariableManager get_vars() 15247 1726867249.21820: Calling all_inventory to load vars for managed_node2 15247 1726867249.21823: Calling groups_inventory to load vars for managed_node2 15247 1726867249.21827: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.21841: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.21844: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.21847: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.23779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.25381: done with get_vars() 15247 1726867249.25401: done getting variables 15247 1726867249.25459: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.25564: variable 'profile' from source: play vars 15247 1726867249.25568: variable 'interface' from source: set_fact 15247 1726867249.25628: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:20:49 -0400 (0:00:00.059) 0:00:18.966 ****** 15247 1726867249.25659: entering _queue_task() for managed_node2/command 15247 1726867249.25929: worker is 1 (out of 1 available) 15247 1726867249.25941: exiting _queue_task() for managed_node2/command 15247 1726867249.25951: done queuing things up, now waiting for results queue to drain 15247 1726867249.25952: waiting for pending results... 
15247 1726867249.26213: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15247 1726867249.26383: in run() - task 0affcac9-a3a5-8ce3-1923-000000000274 15247 1726867249.26387: variable 'ansible_search_path' from source: unknown 15247 1726867249.26390: variable 'ansible_search_path' from source: unknown 15247 1726867249.26402: calling self._execute() 15247 1726867249.26491: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.26510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.26527: variable 'omit' from source: magic vars 15247 1726867249.26941: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.26945: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.27036: variable 'profile_stat' from source: set_fact 15247 1726867249.27060: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867249.27068: when evaluation is False, skipping this task 15247 1726867249.27074: _execute() done 15247 1726867249.27085: dumping result to json 15247 1726867249.27167: done dumping result, returning 15247 1726867249.27171: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000274] 15247 1726867249.27173: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000274 15247 1726867249.27239: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000274 15247 1726867249.27242: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867249.27293: no more pending results, returning what we have 15247 1726867249.27296: results queue empty 15247 1726867249.27297: checking for any_errors_fatal 15247 1726867249.27309: done checking for any_errors_fatal 15247 1726867249.27310: 
checking for max_fail_percentage 15247 1726867249.27312: done checking for max_fail_percentage 15247 1726867249.27313: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.27314: done checking to see if all hosts have failed 15247 1726867249.27314: getting the remaining hosts for this loop 15247 1726867249.27316: done getting the remaining hosts for this loop 15247 1726867249.27320: getting the next task for host managed_node2 15247 1726867249.27326: done getting next task for host managed_node2 15247 1726867249.27329: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15247 1726867249.27333: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.27338: getting variables 15247 1726867249.27339: in VariableManager get_vars() 15247 1726867249.27368: Calling all_inventory to load vars for managed_node2 15247 1726867249.27371: Calling groups_inventory to load vars for managed_node2 15247 1726867249.27375: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.27586: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.27590: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.27593: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.29034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.30607: done with get_vars() 15247 1726867249.30627: done getting variables 15247 1726867249.30682: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.30785: variable 'profile' from source: play vars 15247 1726867249.30789: variable 'interface' from source: set_fact 15247 1726867249.30845: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:20:49 -0400 (0:00:00.052) 0:00:19.018 ****** 15247 1726867249.30875: entering _queue_task() for managed_node2/set_fact 15247 1726867249.31148: worker is 1 (out of 1 available) 15247 1726867249.31159: exiting _queue_task() for managed_node2/set_fact 15247 1726867249.31170: done queuing things up, now waiting for results queue to drain 15247 1726867249.31171: waiting for pending results... 
15247 1726867249.31587: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15247 1726867249.31593: in run() - task 0affcac9-a3a5-8ce3-1923-000000000275 15247 1726867249.31596: variable 'ansible_search_path' from source: unknown 15247 1726867249.31598: variable 'ansible_search_path' from source: unknown 15247 1726867249.31601: calling self._execute() 15247 1726867249.31690: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.31702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.31723: variable 'omit' from source: magic vars 15247 1726867249.32087: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.32102: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.32216: variable 'profile_stat' from source: set_fact 15247 1726867249.32232: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867249.32241: when evaluation is False, skipping this task 15247 1726867249.32248: _execute() done 15247 1726867249.32260: dumping result to json 15247 1726867249.32267: done dumping result, returning 15247 1726867249.32279: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000275] 15247 1726867249.32366: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000275 15247 1726867249.32436: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000275 15247 1726867249.32439: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867249.32488: no more pending results, returning what we have 15247 1726867249.32492: results queue empty 15247 1726867249.32493: checking for any_errors_fatal 15247 1726867249.32498: done checking for any_errors_fatal 15247 1726867249.32499: 
checking for max_fail_percentage 15247 1726867249.32501: done checking for max_fail_percentage 15247 1726867249.32502: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.32503: done checking to see if all hosts have failed 15247 1726867249.32504: getting the remaining hosts for this loop 15247 1726867249.32507: done getting the remaining hosts for this loop 15247 1726867249.32511: getting the next task for host managed_node2 15247 1726867249.32520: done getting next task for host managed_node2 15247 1726867249.32523: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 15247 1726867249.32526: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.32530: getting variables 15247 1726867249.32532: in VariableManager get_vars() 15247 1726867249.32561: Calling all_inventory to load vars for managed_node2 15247 1726867249.32563: Calling groups_inventory to load vars for managed_node2 15247 1726867249.32567: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.32779: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.32783: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.32786: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.34032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.35623: done with get_vars() 15247 1726867249.35642: done getting variables 15247 1726867249.35699: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.35808: variable 'profile' from source: play vars 15247 1726867249.35812: variable 'interface' from source: set_fact 15247 1726867249.35866: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:20:49 -0400 (0:00:00.050) 0:00:19.068 ****** 15247 1726867249.35897: entering _queue_task() for managed_node2/assert 15247 1726867249.36145: worker is 1 (out of 1 available) 15247 1726867249.36157: exiting _queue_task() for managed_node2/assert 15247 1726867249.36169: done queuing things up, now waiting for results queue to drain 15247 1726867249.36170: waiting for pending results... 
15247 1726867249.36504: running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'LSR-TST-br31' 15247 1726867249.36584: in run() - task 0affcac9-a3a5-8ce3-1923-000000000260 15247 1726867249.36587: variable 'ansible_search_path' from source: unknown 15247 1726867249.36590: variable 'ansible_search_path' from source: unknown 15247 1726867249.36603: calling self._execute() 15247 1726867249.36692: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.36708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.36782: variable 'omit' from source: magic vars 15247 1726867249.37072: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.37090: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.37101: variable 'omit' from source: magic vars 15247 1726867249.37151: variable 'omit' from source: magic vars 15247 1726867249.37257: variable 'profile' from source: play vars 15247 1726867249.37266: variable 'interface' from source: set_fact 15247 1726867249.37338: variable 'interface' from source: set_fact 15247 1726867249.37361: variable 'omit' from source: magic vars 15247 1726867249.37403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867249.37450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867249.37472: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867249.37546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.37549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.37551: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 15247 1726867249.37553: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.37555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.37661: Set connection var ansible_shell_executable to /bin/sh 15247 1726867249.37674: Set connection var ansible_connection to ssh 15247 1726867249.37683: Set connection var ansible_shell_type to sh 15247 1726867249.37693: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867249.37776: Set connection var ansible_timeout to 10 15247 1726867249.37781: Set connection var ansible_pipelining to False 15247 1726867249.37783: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.37785: variable 'ansible_connection' from source: unknown 15247 1726867249.37787: variable 'ansible_module_compression' from source: unknown 15247 1726867249.37789: variable 'ansible_shell_type' from source: unknown 15247 1726867249.37790: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.37792: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.37794: variable 'ansible_pipelining' from source: unknown 15247 1726867249.37797: variable 'ansible_timeout' from source: unknown 15247 1726867249.37799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.37918: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867249.37933: variable 'omit' from source: magic vars 15247 1726867249.37942: starting attempt loop 15247 1726867249.37947: running the handler 15247 1726867249.38131: variable 'lsr_net_profile_exists' from source: set_fact 15247 1726867249.38134: Evaluated conditional 
(lsr_net_profile_exists): True 15247 1726867249.38136: handler run complete 15247 1726867249.38139: attempt loop complete, returning result 15247 1726867249.38141: _execute() done 15247 1726867249.38143: dumping result to json 15247 1726867249.38145: done dumping result, returning 15247 1726867249.38147: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is present - 'LSR-TST-br31' [0affcac9-a3a5-8ce3-1923-000000000260] 15247 1726867249.38149: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000260 15247 1726867249.38214: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000260 15247 1726867249.38217: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867249.38283: no more pending results, returning what we have 15247 1726867249.38286: results queue empty 15247 1726867249.38287: checking for any_errors_fatal 15247 1726867249.38294: done checking for any_errors_fatal 15247 1726867249.38295: checking for max_fail_percentage 15247 1726867249.38296: done checking for max_fail_percentage 15247 1726867249.38297: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.38298: done checking to see if all hosts have failed 15247 1726867249.38299: getting the remaining hosts for this loop 15247 1726867249.38301: done getting the remaining hosts for this loop 15247 1726867249.38304: getting the next task for host managed_node2 15247 1726867249.38314: done getting next task for host managed_node2 15247 1726867249.38318: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 15247 1726867249.38320: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867249.38325: getting variables 15247 1726867249.38327: in VariableManager get_vars() 15247 1726867249.38353: Calling all_inventory to load vars for managed_node2 15247 1726867249.38355: Calling groups_inventory to load vars for managed_node2 15247 1726867249.38359: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.38369: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.38371: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.38373: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.39953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.42785: done with get_vars() 15247 1726867249.42813: done getting variables 15247 1726867249.42872: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.42986: variable 'profile' from source: play vars 15247 1726867249.42990: variable 'interface' from source: set_fact 15247 1726867249.43050: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:20:49 -0400 
(0:00:00.071) 0:00:19.140 ****** 15247 1726867249.43088: entering _queue_task() for managed_node2/assert 15247 1726867249.43382: worker is 1 (out of 1 available) 15247 1726867249.43395: exiting _queue_task() for managed_node2/assert 15247 1726867249.43408: done queuing things up, now waiting for results queue to drain 15247 1726867249.43409: waiting for pending results... 15247 1726867249.43798: running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 15247 1726867249.43803: in run() - task 0affcac9-a3a5-8ce3-1923-000000000261 15247 1726867249.43808: variable 'ansible_search_path' from source: unknown 15247 1726867249.43811: variable 'ansible_search_path' from source: unknown 15247 1726867249.43832: calling self._execute() 15247 1726867249.43934: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.43947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.43964: variable 'omit' from source: magic vars 15247 1726867249.44350: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.44366: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.44375: variable 'omit' from source: magic vars 15247 1726867249.44422: variable 'omit' from source: magic vars 15247 1726867249.44529: variable 'profile' from source: play vars 15247 1726867249.44538: variable 'interface' from source: set_fact 15247 1726867249.44610: variable 'interface' from source: set_fact 15247 1726867249.44637: variable 'omit' from source: magic vars 15247 1726867249.44690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867249.44732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867249.44755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 
1726867249.44782: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.44798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.44833: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867249.44842: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.44851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.44961: Set connection var ansible_shell_executable to /bin/sh 15247 1726867249.44971: Set connection var ansible_connection to ssh 15247 1726867249.44980: Set connection var ansible_shell_type to sh 15247 1726867249.44993: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867249.45096: Set connection var ansible_timeout to 10 15247 1726867249.45099: Set connection var ansible_pipelining to False 15247 1726867249.45102: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.45104: variable 'ansible_connection' from source: unknown 15247 1726867249.45108: variable 'ansible_module_compression' from source: unknown 15247 1726867249.45110: variable 'ansible_shell_type' from source: unknown 15247 1726867249.45112: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.45114: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.45116: variable 'ansible_pipelining' from source: unknown 15247 1726867249.45118: variable 'ansible_timeout' from source: unknown 15247 1726867249.45120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.45229: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867249.45246: variable 'omit' from source: magic vars 15247 1726867249.45258: starting attempt loop 15247 1726867249.45265: running the handler 15247 1726867249.45376: variable 'lsr_net_profile_ansible_managed' from source: set_fact 15247 1726867249.45389: Evaluated conditional (lsr_net_profile_ansible_managed): True 15247 1726867249.45399: handler run complete 15247 1726867249.45425: attempt loop complete, returning result 15247 1726867249.45432: _execute() done 15247 1726867249.45439: dumping result to json 15247 1726867249.45447: done dumping result, returning 15247 1726867249.45458: done running TaskExecutor() for managed_node2/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [0affcac9-a3a5-8ce3-1923-000000000261] 15247 1726867249.45469: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000261 15247 1726867249.45721: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000261 15247 1726867249.45724: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867249.45766: no more pending results, returning what we have 15247 1726867249.45768: results queue empty 15247 1726867249.45769: checking for any_errors_fatal 15247 1726867249.45775: done checking for any_errors_fatal 15247 1726867249.45776: checking for max_fail_percentage 15247 1726867249.45779: done checking for max_fail_percentage 15247 1726867249.45781: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.45781: done checking to see if all hosts have failed 15247 1726867249.45782: getting the remaining hosts for this loop 15247 1726867249.45784: done getting the remaining hosts for this loop 15247 1726867249.45787: getting the next task for host managed_node2 15247 1726867249.45793: done 
getting next task for host managed_node2 15247 1726867249.45796: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 15247 1726867249.45799: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867249.45803: getting variables 15247 1726867249.45804: in VariableManager get_vars() 15247 1726867249.45834: Calling all_inventory to load vars for managed_node2 15247 1726867249.45836: Calling groups_inventory to load vars for managed_node2 15247 1726867249.45840: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.45850: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.45853: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.45856: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.47252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.48946: done with get_vars() 15247 1726867249.48965: done getting variables 15247 1726867249.49017: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867249.49116: variable 'profile' from source: play vars 15247 1726867249.49120: variable 'interface' 
from source: set_fact 15247 1726867249.49175: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:20:49 -0400 (0:00:00.061) 0:00:19.201 ****** 15247 1726867249.49214: entering _queue_task() for managed_node2/assert 15247 1726867249.49690: worker is 1 (out of 1 available) 15247 1726867249.49703: exiting _queue_task() for managed_node2/assert 15247 1726867249.49717: done queuing things up, now waiting for results queue to drain 15247 1726867249.49719: waiting for pending results... 15247 1726867249.50233: running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 15247 1726867249.50479: in run() - task 0affcac9-a3a5-8ce3-1923-000000000262 15247 1726867249.50499: variable 'ansible_search_path' from source: unknown 15247 1726867249.50508: variable 'ansible_search_path' from source: unknown 15247 1726867249.50629: calling self._execute() 15247 1726867249.50709: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.50985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.50988: variable 'omit' from source: magic vars 15247 1726867249.51695: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.51715: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.51726: variable 'omit' from source: magic vars 15247 1726867249.51828: variable 'omit' from source: magic vars 15247 1726867249.52045: variable 'profile' from source: play vars 15247 1726867249.52056: variable 'interface' from source: set_fact 15247 1726867249.52129: variable 'interface' from source: set_fact 15247 1726867249.52151: variable 'omit' from source: magic vars 15247 1726867249.52194: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867249.52235: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867249.52259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867249.52281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.52350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.52353: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867249.52355: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.52357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.52448: Set connection var ansible_shell_executable to /bin/sh 15247 1726867249.52463: Set connection var ansible_connection to ssh 15247 1726867249.52470: Set connection var ansible_shell_type to sh 15247 1726867249.52484: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867249.52495: Set connection var ansible_timeout to 10 15247 1726867249.52504: Set connection var ansible_pipelining to False 15247 1726867249.52532: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.52539: variable 'ansible_connection' from source: unknown 15247 1726867249.52545: variable 'ansible_module_compression' from source: unknown 15247 1726867249.52551: variable 'ansible_shell_type' from source: unknown 15247 1726867249.52565: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.52568: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.52675: variable 'ansible_pipelining' from source: unknown 15247 1726867249.52680: variable 'ansible_timeout' from 
source: unknown 15247 1726867249.52682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.52725: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867249.52740: variable 'omit' from source: magic vars 15247 1726867249.52749: starting attempt loop 15247 1726867249.52755: running the handler 15247 1726867249.52867: variable 'lsr_net_profile_fingerprint' from source: set_fact 15247 1726867249.52879: Evaluated conditional (lsr_net_profile_fingerprint): True 15247 1726867249.52893: handler run complete 15247 1726867249.52913: attempt loop complete, returning result 15247 1726867249.52919: _execute() done 15247 1726867249.52925: dumping result to json 15247 1726867249.52934: done dumping result, returning 15247 1726867249.52944: done running TaskExecutor() for managed_node2/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000262] 15247 1726867249.52953: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000262 15247 1726867249.53184: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000262 15247 1726867249.53187: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867249.53233: no more pending results, returning what we have 15247 1726867249.53236: results queue empty 15247 1726867249.53237: checking for any_errors_fatal 15247 1726867249.53244: done checking for any_errors_fatal 15247 1726867249.53244: checking for max_fail_percentage 15247 1726867249.53246: done checking for max_fail_percentage 15247 1726867249.53247: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.53248: done checking to see if all 
hosts have failed 15247 1726867249.53249: getting the remaining hosts for this loop 15247 1726867249.53251: done getting the remaining hosts for this loop 15247 1726867249.53254: getting the next task for host managed_node2 15247 1726867249.53262: done getting next task for host managed_node2 15247 1726867249.53265: ^ task is: TASK: meta (flush_handlers) 15247 1726867249.53267: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867249.53271: getting variables 15247 1726867249.53273: in VariableManager get_vars() 15247 1726867249.53300: Calling all_inventory to load vars for managed_node2 15247 1726867249.53302: Calling groups_inventory to load vars for managed_node2 15247 1726867249.53309: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.53320: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.53323: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.53325: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.55130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.58327: done with get_vars() 15247 1726867249.58351: done getting variables 15247 1726867249.58419: in VariableManager get_vars() 15247 1726867249.58429: Calling all_inventory to load vars for managed_node2 15247 1726867249.58431: Calling groups_inventory to load vars for managed_node2 15247 1726867249.58434: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.58438: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.58440: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.58443: Calling 
groups_plugins_play to load vars for managed_node2 15247 1726867249.61102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.64319: done with get_vars() 15247 1726867249.64357: done queuing things up, now waiting for results queue to drain 15247 1726867249.64360: results queue empty 15247 1726867249.64361: checking for any_errors_fatal 15247 1726867249.64363: done checking for any_errors_fatal 15247 1726867249.64364: checking for max_fail_percentage 15247 1726867249.64365: done checking for max_fail_percentage 15247 1726867249.64372: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.64372: done checking to see if all hosts have failed 15247 1726867249.64373: getting the remaining hosts for this loop 15247 1726867249.64374: done getting the remaining hosts for this loop 15247 1726867249.64380: getting the next task for host managed_node2 15247 1726867249.64384: done getting next task for host managed_node2 15247 1726867249.64386: ^ task is: TASK: meta (flush_handlers) 15247 1726867249.64387: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.64390: getting variables 15247 1726867249.64391: in VariableManager get_vars() 15247 1726867249.64402: Calling all_inventory to load vars for managed_node2 15247 1726867249.64405: Calling groups_inventory to load vars for managed_node2 15247 1726867249.64410: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.64415: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.64418: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.64421: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.66799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.69939: done with get_vars() 15247 1726867249.69967: done getting variables 15247 1726867249.70021: in VariableManager get_vars() 15247 1726867249.70032: Calling all_inventory to load vars for managed_node2 15247 1726867249.70035: Calling groups_inventory to load vars for managed_node2 15247 1726867249.70037: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.70041: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.70044: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.70046: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.72731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.76029: done with get_vars() 15247 1726867249.76060: done queuing things up, now waiting for results queue to drain 15247 1726867249.76062: results queue empty 15247 1726867249.76063: checking for any_errors_fatal 15247 1726867249.76065: done checking for any_errors_fatal 15247 1726867249.76065: checking for max_fail_percentage 15247 1726867249.76067: done checking for max_fail_percentage 15247 1726867249.76067: checking to see if all hosts have failed and the running result is not 
ok 15247 1726867249.76068: done checking to see if all hosts have failed 15247 1726867249.76069: getting the remaining hosts for this loop 15247 1726867249.76070: done getting the remaining hosts for this loop 15247 1726867249.76073: getting the next task for host managed_node2 15247 1726867249.76076: done getting next task for host managed_node2 15247 1726867249.76079: ^ task is: None 15247 1726867249.76080: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867249.76082: done queuing things up, now waiting for results queue to drain 15247 1726867249.76082: results queue empty 15247 1726867249.76083: checking for any_errors_fatal 15247 1726867249.76084: done checking for any_errors_fatal 15247 1726867249.76084: checking for max_fail_percentage 15247 1726867249.76085: done checking for max_fail_percentage 15247 1726867249.76086: checking to see if all hosts have failed and the running result is not ok 15247 1726867249.76087: done checking to see if all hosts have failed 15247 1726867249.76088: getting the next task for host managed_node2 15247 1726867249.76090: done getting next task for host managed_node2 15247 1726867249.76091: ^ task is: None 15247 1726867249.76092: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.76399: in VariableManager get_vars() 15247 1726867249.76423: done with get_vars() 15247 1726867249.76428: in VariableManager get_vars() 15247 1726867249.76438: done with get_vars() 15247 1726867249.76442: variable 'omit' from source: magic vars 15247 1726867249.76552: variable 'profile' from source: play vars 15247 1726867249.76668: in VariableManager get_vars() 15247 1726867249.76686: done with get_vars() 15247 1726867249.76710: variable 'omit' from source: magic vars 15247 1726867249.76772: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 15247 1726867249.77510: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867249.77532: getting the remaining hosts for this loop 15247 1726867249.77533: done getting the remaining hosts for this loop 15247 1726867249.77536: getting the next task for host managed_node2 15247 1726867249.77539: done getting next task for host managed_node2 15247 1726867249.77540: ^ task is: TASK: Gathering Facts 15247 1726867249.77542: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867249.77544: getting variables 15247 1726867249.77545: in VariableManager get_vars() 15247 1726867249.77555: Calling all_inventory to load vars for managed_node2 15247 1726867249.77558: Calling groups_inventory to load vars for managed_node2 15247 1726867249.77560: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867249.77565: Calling all_plugins_play to load vars for managed_node2 15247 1726867249.77568: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867249.77570: Calling groups_plugins_play to load vars for managed_node2 15247 1726867249.78802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867249.80308: done with get_vars() 15247 1726867249.80326: done getting variables 15247 1726867249.80367: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 17:20:49 -0400 (0:00:00.311) 0:00:19.513 ****** 15247 1726867249.80392: entering _queue_task() for managed_node2/gather_facts 15247 1726867249.80708: worker is 1 (out of 1 available) 15247 1726867249.80720: exiting _queue_task() for managed_node2/gather_facts 15247 1726867249.80732: done queuing things up, now waiting for results queue to drain 15247 1726867249.80733: waiting for pending results... 
15247 1726867249.81025: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867249.81183: in run() - task 0affcac9-a3a5-8ce3-1923-0000000002b5 15247 1726867249.81186: variable 'ansible_search_path' from source: unknown 15247 1726867249.81188: calling self._execute() 15247 1726867249.81260: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.81271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.81289: variable 'omit' from source: magic vars 15247 1726867249.81654: variable 'ansible_distribution_major_version' from source: facts 15247 1726867249.81669: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867249.81682: variable 'omit' from source: magic vars 15247 1726867249.81745: variable 'omit' from source: magic vars 15247 1726867249.81749: variable 'omit' from source: magic vars 15247 1726867249.81790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867249.81830: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867249.81859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867249.81884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.81961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867249.81964: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867249.81966: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.81968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.82048: Set connection var ansible_shell_executable to /bin/sh 15247 1726867249.82058: Set 
connection var ansible_connection to ssh 15247 1726867249.82064: Set connection var ansible_shell_type to sh 15247 1726867249.82073: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867249.82088: Set connection var ansible_timeout to 10 15247 1726867249.82096: Set connection var ansible_pipelining to False 15247 1726867249.82125: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.82132: variable 'ansible_connection' from source: unknown 15247 1726867249.82182: variable 'ansible_module_compression' from source: unknown 15247 1726867249.82187: variable 'ansible_shell_type' from source: unknown 15247 1726867249.82190: variable 'ansible_shell_executable' from source: unknown 15247 1726867249.82192: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867249.82194: variable 'ansible_pipelining' from source: unknown 15247 1726867249.82196: variable 'ansible_timeout' from source: unknown 15247 1726867249.82197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867249.82382: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867249.82386: variable 'omit' from source: magic vars 15247 1726867249.82388: starting attempt loop 15247 1726867249.82390: running the handler 15247 1726867249.82392: variable 'ansible_facts' from source: unknown 15247 1726867249.82419: _low_level_execute_command(): starting 15247 1726867249.82434: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867249.83158: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867249.83187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 
1726867249.83288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867249.83311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867249.83702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867249.83760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867249.85444: stdout chunk (state=3): >>>/root <<< 15247 1726867249.85541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867249.85582: stderr chunk (state=3): >>><<< 15247 1726867249.85588: stdout chunk (state=3): >>><<< 15247 1726867249.85619: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867249.85876: _low_level_execute_command(): starting 15247 1726867249.85883: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648 `" && echo ansible-tmp-1726867249.8578577-16229-55039208095648="` echo /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648 `" ) && sleep 0' 15247 1726867249.86936: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867249.86949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867249.87202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867249.87247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867249.89139: stdout chunk (state=3): >>>ansible-tmp-1726867249.8578577-16229-55039208095648=/root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648 <<< 15247 1726867249.89297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867249.89343: stderr chunk (state=3): >>><<< 15247 1726867249.89388: stdout chunk (state=3): >>><<< 15247 1726867249.89416: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867249.8578577-16229-55039208095648=/root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867249.89525: variable 'ansible_module_compression' from source: unknown 15247 1726867249.89585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867249.89751: variable 'ansible_facts' from source: unknown 15247 1726867249.90226: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py 15247 1726867249.91021: Sending initial data 15247 1726867249.91024: Sent initial data (153 bytes) 15247 1726867249.91940: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867249.91944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867249.91947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867249.91997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867249.92010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867249.92182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867249.92193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867249.92427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867249.92470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867249.94056: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867249.94093: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867249.94128: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpbijhnlq8 /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py <<< 15247 1726867249.94137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py" <<< 15247 1726867249.94195: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpbijhnlq8" to remote "/root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py" <<< 15247 1726867249.96916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867249.96929: stderr chunk (state=3): >>><<< 15247 1726867249.96937: stdout chunk (state=3): >>><<< 15247 1726867249.96975: done transferring module to remote 15247 1726867249.97098: _low_level_execute_command(): starting 15247 1726867249.97108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/ /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py && sleep 0' 15247 1726867249.98405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867249.98408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867249.98650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867249.98717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867250.00526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867250.00585: stderr chunk (state=3): >>><<< 15247 1726867250.00595: stdout chunk (state=3): >>><<< 15247 1726867250.00617: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867250.00866: _low_level_execute_command(): starting 15247 1726867250.00869: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/AnsiballZ_setup.py && sleep 0' 15247 1726867250.01975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867250.02004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867250.02017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867250.02137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867250.02492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867250.02557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867250.66996: stdout chunk (state=3): >>> 
{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "50", "epoch": "1726867250", "epoch_int": "1726867250", "date": "2024-09-20", "time": "17:20:50", "iso8601_micro": "2024-09-20T21:20:50.304197Z", "iso8601": "2024-09-20T21:20:50Z", "iso8601_basic": "20240920T172050304197", "iso8601_basic_short": "20240920T172050", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", 
"10.2.32.1"]}, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "LSR-TST-br31", "lo"], "ansible_LSR_TST_br31": {"device"<<< 15247 1726867250.67035: stdout chunk (state=3): >>>: "LSR-TST-br31", "macaddress": "62:84:2b:2f:a5:23", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", 
"esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", 
"tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", 
"tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", 
"network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_service_mgr": "syste<<< 15247 1726867250.67068: stdout chunk (state=3): >>>md", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": 
{"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 488, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796974592, "block_size": 4096, "block_total": 65519099, "block_available": 63915277, "block_used": 1603822, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.56103515625, "5m": 0.38134765625, "15m": 0.18896484375}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867250.69101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867250.69182: stderr chunk (state=3): >>><<< 15247 1726867250.69186: stdout chunk (state=3): >>><<< 15247 1726867250.69236: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", 
"root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "50", "epoch": "1726867250", "epoch_int": "1726867250", "date": "2024-09-20", "time": "17:20:50", "iso8601_micro": "2024-09-20T21:20:50.304197Z", "iso8601": "2024-09-20T21:20:50Z", "iso8601_basic": "20240920T172050304197", "iso8601_basic_short": "20240920T172050", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_interfaces": ["eth0", "LSR-TST-br31", "lo"], "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "62:84:2b:2f:a5:23", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", 
"macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off 
[fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2945, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 586, "free": 2945}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, 
"used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 488, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796974592, "block_size": 4096, "block_total": 
65519099, "block_available": 63915277, "block_used": 1603822, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.56103515625, "5m": 0.38134765625, "15m": 0.18896484375}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867250.69784: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867250.69819: _low_level_execute_command(): starting 15247 1726867250.69822: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867249.8578577-16229-55039208095648/ > /dev/null 2>&1 && sleep 0' 15247 1726867250.70470: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867250.70478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867250.70481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867250.70552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867250.70556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867250.70609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867250.72552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867250.72555: stdout chunk (state=3): >>><<< 15247 1726867250.72557: stderr chunk (state=3): >>><<< 15247 1726867250.72582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867250.72585: handler run complete 15247 1726867250.72760: variable 'ansible_facts' from source: unknown 15247 1726867250.72805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.73078: variable 'ansible_facts' from source: unknown 15247 1726867250.73147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.73241: attempt loop complete, returning result 15247 1726867250.73244: _execute() done 15247 1726867250.73246: dumping result to json 15247 1726867250.73268: done dumping result, returning 15247 1726867250.73287: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-0000000002b5] 15247 1726867250.73290: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002b5 15247 1726867250.73798: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002b5 15247 1726867250.73801: WORKER PROCESS EXITING
ok: [managed_node2]
15247 1726867250.74013: no more pending results, returning what we have 15247 1726867250.74016: results queue empty 15247
1726867250.74016: checking for any_errors_fatal 15247 1726867250.74017: done checking for any_errors_fatal 15247 1726867250.74018: checking for max_fail_percentage 15247 1726867250.74019: done checking for max_fail_percentage 15247 1726867250.74020: checking to see if all hosts have failed and the running result is not ok 15247 1726867250.74021: done checking to see if all hosts have failed 15247 1726867250.74022: getting the remaining hosts for this loop 15247 1726867250.74023: done getting the remaining hosts for this loop 15247 1726867250.74025: getting the next task for host managed_node2 15247 1726867250.74028: done getting next task for host managed_node2 15247 1726867250.74030: ^ task is: TASK: meta (flush_handlers) 15247 1726867250.74031: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867250.74035: getting variables 15247 1726867250.74035: in VariableManager get_vars() 15247 1726867250.74076: Calling all_inventory to load vars for managed_node2 15247 1726867250.74095: Calling groups_inventory to load vars for managed_node2 15247 1726867250.74098: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867250.74117: Calling all_plugins_play to load vars for managed_node2 15247 1726867250.74123: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867250.74129: Calling groups_plugins_play to load vars for managed_node2 15247 1726867250.75191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.76271: done with get_vars() 15247 1726867250.76293: done getting variables 15247 1726867250.76341: in VariableManager get_vars() 15247 1726867250.76349: Calling all_inventory to load vars for managed_node2 15247 1726867250.76350: Calling groups_inventory to load vars for managed_node2 15247 1726867250.76352: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867250.76362: Calling all_plugins_play to load vars for managed_node2 15247 1726867250.76366: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867250.76373: Calling groups_plugins_play to load vars for managed_node2 15247 1726867250.81580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.82728: done with get_vars() 15247 1726867250.82754: done queuing things up, now waiting for results queue to drain 15247 1726867250.82756: results queue empty 15247 1726867250.82757: checking for any_errors_fatal 15247 1726867250.82760: done checking for any_errors_fatal 15247 1726867250.82761: checking for max_fail_percentage 15247 1726867250.82762: done checking for max_fail_percentage 15247 1726867250.82762: checking to see if all hosts have failed and the running result is not 
ok 15247 1726867250.82767: done checking to see if all hosts have failed 15247 1726867250.82768: getting the remaining hosts for this loop 15247 1726867250.82768: done getting the remaining hosts for this loop 15247 1726867250.82771: getting the next task for host managed_node2 15247 1726867250.82774: done getting next task for host managed_node2 15247 1726867250.82776: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15247 1726867250.82779: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867250.82786: getting variables 15247 1726867250.82787: in VariableManager get_vars() 15247 1726867250.82795: Calling all_inventory to load vars for managed_node2 15247 1726867250.82797: Calling groups_inventory to load vars for managed_node2 15247 1726867250.82798: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867250.82801: Calling all_plugins_play to load vars for managed_node2 15247 1726867250.82802: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867250.82804: Calling groups_plugins_play to load vars for managed_node2 15247 1726867250.83633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.85028: done with get_vars() 15247 1726867250.85052: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Friday 20 September 2024 17:20:50 -0400 (0:00:01.047) 0:00:20.561 ******
15247 1726867250.85144: entering _queue_task() for managed_node2/include_tasks 15247 1726867250.85548: worker is 1 (out of 1 available) 15247
1726867250.85560: exiting _queue_task() for managed_node2/include_tasks 15247 1726867250.85574: done queuing things up, now waiting for results queue to drain 15247 1726867250.85575: waiting for pending results... 15247 1726867250.85861: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15247 1726867250.85938: in run() - task 0affcac9-a3a5-8ce3-1923-00000000003a 15247 1726867250.85949: variable 'ansible_search_path' from source: unknown 15247 1726867250.85953: variable 'ansible_search_path' from source: unknown 15247 1726867250.85985: calling self._execute() 15247 1726867250.86082: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867250.86088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867250.86091: variable 'omit' from source: magic vars 15247 1726867250.86590: variable 'ansible_distribution_major_version' from source: facts 15247 1726867250.86593: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867250.86596: _execute() done 15247 1726867250.86598: dumping result to json 15247 1726867250.86599: done dumping result, returning 15247 1726867250.86601: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-8ce3-1923-00000000003a] 15247 1726867250.86603: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003a 15247 1726867250.86673: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003a 15247 1726867250.86675: WORKER PROCESS EXITING 15247 1726867250.86722: no more pending results, returning what we have 15247 1726867250.86727: in VariableManager get_vars() 15247 1726867250.86765: Calling all_inventory to load vars for managed_node2 15247 1726867250.86768: Calling groups_inventory to load vars for managed_node2 15247 1726867250.86770: Calling all_plugins_inventory to load vars for managed_node2 
15247 1726867250.86800: Calling all_plugins_play to load vars for managed_node2 15247 1726867250.86803: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867250.86809: Calling groups_plugins_play to load vars for managed_node2 15247 1726867250.88597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.89714: done with get_vars() 15247 1726867250.89727: variable 'ansible_search_path' from source: unknown 15247 1726867250.89728: variable 'ansible_search_path' from source: unknown 15247 1726867250.89748: we have included files to process 15247 1726867250.89748: generating all_blocks data 15247 1726867250.89749: done generating all_blocks data 15247 1726867250.89750: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867250.89750: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867250.89755: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867250.90308: done processing included file 15247 1726867250.90309: iterating over new_blocks loaded from include file 15247 1726867250.90310: in VariableManager get_vars() 15247 1726867250.90331: done with get_vars() 15247 1726867250.90333: filtering new block on tags 15247 1726867250.90350: done filtering new block on tags 15247 1726867250.90358: in VariableManager get_vars() 15247 1726867250.90387: done with get_vars() 15247 1726867250.90389: filtering new block on tags 15247 1726867250.90405: done filtering new block on tags 15247 1726867250.90410: in VariableManager get_vars() 15247 1726867250.90429: done with get_vars() 15247 1726867250.90431: filtering new block on tags 15247 1726867250.90447: done filtering new block on tags 15247 1726867250.90449: done iterating over new_blocks 
loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 15247 1726867250.90455: extending task lists for all hosts with included blocks 15247 1726867250.91055: done extending task lists 15247 1726867250.91056: done processing included files 15247 1726867250.91057: results queue empty 15247 1726867250.91058: checking for any_errors_fatal 15247 1726867250.91060: done checking for any_errors_fatal 15247 1726867250.91060: checking for max_fail_percentage 15247 1726867250.91062: done checking for max_fail_percentage 15247 1726867250.91062: checking to see if all hosts have failed and the running result is not ok 15247 1726867250.91063: done checking to see if all hosts have failed 15247 1726867250.91064: getting the remaining hosts for this loop 15247 1726867250.91065: done getting the remaining hosts for this loop 15247 1726867250.91067: getting the next task for host managed_node2 15247 1726867250.91071: done getting next task for host managed_node2 15247 1726867250.91074: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15247 1726867250.91076: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867250.91088: getting variables 15247 1726867250.91089: in VariableManager get_vars() 15247 1726867250.91102: Calling all_inventory to load vars for managed_node2 15247 1726867250.91104: Calling groups_inventory to load vars for managed_node2 15247 1726867250.91108: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867250.91112: Calling all_plugins_play to load vars for managed_node2 15247 1726867250.91116: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867250.91119: Calling groups_plugins_play to load vars for managed_node2 15247 1726867250.92254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867250.93949: done with get_vars() 15247 1726867250.93973: done getting variables
TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 17:20:50 -0400 (0:00:00.089) 0:00:20.650 ******
15247 1726867250.94064: entering _queue_task() for managed_node2/setup 15247 1726867250.94438: worker is 1 (out of 1 available) 15247 1726867250.94456: exiting _queue_task() for managed_node2/setup 15247 1726867250.94475: done queuing things up, now waiting for results queue to drain 15247 1726867250.94480: waiting for pending results...
15247 1726867250.94825: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15247 1726867250.95030: in run() - task 0affcac9-a3a5-8ce3-1923-0000000002f6 15247 1726867250.95034: variable 'ansible_search_path' from source: unknown 15247 1726867250.95038: variable 'ansible_search_path' from source: unknown 15247 1726867250.95041: calling self._execute() 15247 1726867250.95213: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867250.95217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867250.95219: variable 'omit' from source: magic vars 15247 1726867250.95500: variable 'ansible_distribution_major_version' from source: facts 15247 1726867250.95512: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867250.95729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867250.97864: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867250.97867: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867250.97901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867250.98050: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867250.98147: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867250.98300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867250.98331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867250.98355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867250.98592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867250.98610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867250.98660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867250.98720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867250.98735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867250.98774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867250.98832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867250.99070: variable '__network_required_facts' from source: role 
'' defaults 15247 1726867250.99081: variable 'ansible_facts' from source: unknown 15247 1726867251.00600: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15247 1726867251.00605: when evaluation is False, skipping this task 15247 1726867251.00610: _execute() done 15247 1726867251.00613: dumping result to json 15247 1726867251.00615: done dumping result, returning 15247 1726867251.00619: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-8ce3-1923-0000000002f6] 15247 1726867251.00625: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002f6 15247 1726867251.00827: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002f6 15247 1726867251.00830: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
15247 1726867251.00892: no more pending results, returning what we have 15247 1726867251.00896: results queue empty 15247 1726867251.00897: checking for any_errors_fatal 15247 1726867251.00899: done checking for any_errors_fatal 15247 1726867251.00900: checking for max_fail_percentage 15247 1726867251.00901: done checking for max_fail_percentage 15247 1726867251.00902: checking to see if all hosts have failed and the running result is not ok 15247 1726867251.00903: done checking to see if all hosts have failed 15247 1726867251.00904: getting the remaining hosts for this loop 15247 1726867251.00905: done getting the remaining hosts for this loop 15247 1726867251.00909: getting the next task for host managed_node2 15247 1726867251.00918: done getting next task for host managed_node2 15247 1726867251.00922: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15247 1726867251.00925: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0,
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867251.00938: getting variables 15247 1726867251.00940: in VariableManager get_vars() 15247 1726867251.00980: Calling all_inventory to load vars for managed_node2 15247 1726867251.00982: Calling groups_inventory to load vars for managed_node2 15247 1726867251.00985: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867251.00995: Calling all_plugins_play to load vars for managed_node2 15247 1726867251.00998: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867251.01001: Calling groups_plugins_play to load vars for managed_node2 15247 1726867251.03850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867251.07026: done with get_vars() 15247 1726867251.07049: done getting variables
TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 17:20:51 -0400 (0:00:00.130) 0:00:20.781 ******
15247 1726867251.07145: entering _queue_task() for managed_node2/stat 15247 1726867251.07912: worker is 1 (out of 1 available) 15247 1726867251.07922: exiting _queue_task() for managed_node2/stat 15247 1726867251.07933: done queuing things up, now waiting for results queue to drain 15247 1726867251.07934: waiting for pending results...
15247 1726867251.08301: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15247 1726867251.08633: in run() - task 0affcac9-a3a5-8ce3-1923-0000000002f8 15247 1726867251.08637: variable 'ansible_search_path' from source: unknown 15247 1726867251.08640: variable 'ansible_search_path' from source: unknown 15247 1726867251.08643: calling self._execute() 15247 1726867251.08893: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867251.08909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867251.08927: variable 'omit' from source: magic vars 15247 1726867251.09644: variable 'ansible_distribution_major_version' from source: facts 15247 1726867251.09939: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867251.10009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867251.10483: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867251.10647: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867251.10743: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867251.10854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867251.11100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867251.11170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867251.11283: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867251.11320: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867251.11526: variable '__network_is_ostree' from source: set_fact 15247 1726867251.11538: Evaluated conditional (not __network_is_ostree is defined): False 15247 1726867251.11545: when evaluation is False, skipping this task 15247 1726867251.11551: _execute() done 15247 1726867251.11558: dumping result to json 15247 1726867251.11564: done dumping result, returning 15247 1726867251.11587: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-8ce3-1923-0000000002f8] 15247 1726867251.11782: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002f8 15247 1726867251.11853: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002f8 15247 1726867251.11856: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15247 1726867251.11907: no more pending results, returning what we have 15247 1726867251.11911: results queue empty 15247 1726867251.11912: checking for any_errors_fatal 15247 1726867251.11918: done checking for any_errors_fatal 15247 1726867251.11919: checking for max_fail_percentage 15247 1726867251.11921: done checking for max_fail_percentage 15247 1726867251.11923: checking to see if all hosts have failed and the running result is not ok 15247 1726867251.11924: done checking to see if all hosts have failed 15247 1726867251.11924: getting the remaining hosts for this loop 15247 1726867251.11926: done getting the remaining hosts for this loop 15247
1726867251.11931: getting the next task for host managed_node2 15247 1726867251.11937: done getting next task for host managed_node2 15247 1726867251.11940: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15247 1726867251.11943: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867251.11957: getting variables 15247 1726867251.11959: in VariableManager get_vars() 15247 1726867251.11996: Calling all_inventory to load vars for managed_node2 15247 1726867251.11998: Calling groups_inventory to load vars for managed_node2 15247 1726867251.12000: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867251.12010: Calling all_plugins_play to load vars for managed_node2 15247 1726867251.12013: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867251.12015: Calling groups_plugins_play to load vars for managed_node2 15247 1726867251.14869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867251.16633: done with get_vars() 15247 1726867251.16658: done getting variables 15247 1726867251.16724: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 17:20:51 -0400 (0:00:00.096) 0:00:20.877 ******
15247 1726867251.16759: entering _queue_task() for managed_node2/set_fact 15247 1726867251.17283: worker is 1 (out of 1 available) 15247 1726867251.17295: exiting _queue_task() for managed_node2/set_fact 15247 1726867251.17307: done queuing things up, now waiting for results queue to drain 15247 1726867251.17308: waiting for pending results... 15247 1726867251.17595: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15247 1726867251.17686: in run() - task 0affcac9-a3a5-8ce3-1923-0000000002f9 15247 1726867251.17691: variable 'ansible_search_path' from source: unknown 15247 1726867251.17694: variable 'ansible_search_path' from source: unknown 15247 1726867251.17697: calling self._execute() 15247 1726867251.17742: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867251.17748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867251.17763: variable 'omit' from source: magic vars 15247 1726867251.18183: variable 'ansible_distribution_major_version' from source: facts 15247 1726867251.18187: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867251.18383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867251.18667: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867251.18671: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867251.18674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247
1726867251.18708: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867251.18823: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867251.18853: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867251.18880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867251.18912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867251.19082: variable '__network_is_ostree' from source: set_fact 15247 1726867251.19085: Evaluated conditional (not __network_is_ostree is defined): False 15247 1726867251.19087: when evaluation is False, skipping this task 15247 1726867251.19089: _execute() done 15247 1726867251.19090: dumping result to json 15247 1726867251.19092: done dumping result, returning 15247 1726867251.19094: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-8ce3-1923-0000000002f9] 15247 1726867251.19096: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002f9 15247 1726867251.19157: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002f9 15247 1726867251.19161: WORKER PROCESS EXITING
skipping: [managed_node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
15247 1726867251.19209: no more pending results, returning what we
have 15247 1726867251.19212: results queue empty 15247 1726867251.19214: checking for any_errors_fatal 15247 1726867251.19222: done checking for any_errors_fatal 15247 1726867251.19223: checking for max_fail_percentage 15247 1726867251.19225: done checking for max_fail_percentage 15247 1726867251.19226: checking to see if all hosts have failed and the running result is not ok 15247 1726867251.19227: done checking to see if all hosts have failed 15247 1726867251.19228: getting the remaining hosts for this loop 15247 1726867251.19229: done getting the remaining hosts for this loop 15247 1726867251.19233: getting the next task for host managed_node2 15247 1726867251.19244: done getting next task for host managed_node2 15247 1726867251.19248: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15247 1726867251.19252: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867251.19269: getting variables 15247 1726867251.19272: in VariableManager get_vars() 15247 1726867251.19312: Calling all_inventory to load vars for managed_node2 15247 1726867251.19315: Calling groups_inventory to load vars for managed_node2 15247 1726867251.19317: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867251.19328: Calling all_plugins_play to load vars for managed_node2 15247 1726867251.19331: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867251.19333: Calling groups_plugins_play to load vars for managed_node2 15247 1726867251.21957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867251.25153: done with get_vars() 15247 1726867251.25382: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:20:51 -0400 (0:00:00.087) 0:00:20.964 ****** 15247 1726867251.25474: entering _queue_task() for managed_node2/service_facts 15247 1726867251.26211: worker is 1 (out of 1 available) 15247 1726867251.26220: exiting _queue_task() for managed_node2/service_facts 15247 1726867251.26230: done queuing things up, now waiting for results queue to drain 15247 1726867251.26231: waiting for pending results... 
15247 1726867251.26797: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 15247 1726867251.26857: in run() - task 0affcac9-a3a5-8ce3-1923-0000000002fb 15247 1726867251.27135: variable 'ansible_search_path' from source: unknown 15247 1726867251.27138: variable 'ansible_search_path' from source: unknown 15247 1726867251.27141: calling self._execute() 15247 1726867251.27202: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867251.27257: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867251.27461: variable 'omit' from source: magic vars 15247 1726867251.28071: variable 'ansible_distribution_major_version' from source: facts 15247 1726867251.28125: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867251.28140: variable 'omit' from source: magic vars 15247 1726867251.28281: variable 'omit' from source: magic vars 15247 1726867251.28325: variable 'omit' from source: magic vars 15247 1726867251.28551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867251.28555: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867251.28557: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867251.28672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867251.28691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867251.28728: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867251.28737: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867251.28774: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 15247 1726867251.28993: Set connection var ansible_shell_executable to /bin/sh 15247 1726867251.29000: Set connection var ansible_connection to ssh 15247 1726867251.29005: Set connection var ansible_shell_type to sh 15247 1726867251.29016: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867251.29025: Set connection var ansible_timeout to 10 15247 1726867251.29311: Set connection var ansible_pipelining to False 15247 1726867251.29316: variable 'ansible_shell_executable' from source: unknown 15247 1726867251.29318: variable 'ansible_connection' from source: unknown 15247 1726867251.29320: variable 'ansible_module_compression' from source: unknown 15247 1726867251.29321: variable 'ansible_shell_type' from source: unknown 15247 1726867251.29323: variable 'ansible_shell_executable' from source: unknown 15247 1726867251.29324: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867251.29326: variable 'ansible_pipelining' from source: unknown 15247 1726867251.29328: variable 'ansible_timeout' from source: unknown 15247 1726867251.29329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867251.29579: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867251.29596: variable 'omit' from source: magic vars 15247 1726867251.29609: starting attempt loop 15247 1726867251.29616: running the handler 15247 1726867251.29782: _low_level_execute_command(): starting 15247 1726867251.29785: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867251.31282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867251.31487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867251.31735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867251.33586: stdout chunk (state=3): >>>/root <<< 15247 1726867251.33589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867251.33592: stdout chunk (state=3): >>><<< 15247 1726867251.33595: stderr chunk (state=3): >>><<< 15247 1726867251.33618: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867251.33639: _low_level_execute_command(): starting 15247 1726867251.33781: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230 `" && echo ansible-tmp-1726867251.3362575-16271-203000563186230="` echo /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230 `" ) && sleep 0' 15247 1726867251.34897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867251.35094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867251.35147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867251.35280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867251.35319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867251.37271: stdout chunk (state=3): >>>ansible-tmp-1726867251.3362575-16271-203000563186230=/root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230 <<< 15247 1726867251.37562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867251.37570: stdout chunk (state=3): >>><<< 15247 1726867251.37583: stderr chunk (state=3): >>><<< 15247 1726867251.37652: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867251.3362575-16271-203000563186230=/root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867251.38082: variable 'ansible_module_compression' from source: unknown 15247 1726867251.38085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15247 1726867251.38285: variable 'ansible_facts' from source: unknown 15247 1726867251.38389: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py 15247 1726867251.38848: Sending initial data 15247 1726867251.38852: Sent initial data (162 bytes) 15247 1726867251.40319: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867251.40460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867251.40619: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15247 1726867251.40658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867251.42283: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867251.42400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867251.42465: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpyjta7y1h /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py <<< 15247 1726867251.42490: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpyjta7y1h" to remote "/root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py" <<< 15247 1726867251.43987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867251.43991: stdout chunk (state=3): >>><<< 15247 1726867251.43993: stderr chunk (state=3): >>><<< 15247 1726867251.43995: done transferring module to remote 15247 1726867251.43997: _low_level_execute_command(): starting 15247 1726867251.43999: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/ /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py && sleep 0' 15247 1726867251.45184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867251.45193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867251.45203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867251.45218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867251.45243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867251.45268: 
stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867251.45380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867251.45467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867251.45570: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867251.45683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867251.47519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867251.47528: stdout chunk (state=3): >>><<< 15247 1726867251.47530: stderr chunk (state=3): >>><<< 15247 1726867251.47684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867251.47688: _low_level_execute_command(): starting 15247 1726867251.47691: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/AnsiballZ_service_facts.py && sleep 0' 15247 1726867251.48100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867251.48111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867251.48119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867251.48133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867251.48145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867251.48152: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867251.48161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867251.48174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867251.48185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867251.48193: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15247 1726867251.48201: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 15247 1726867251.48211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867251.48222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867251.48229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867251.48241: stderr chunk (state=3): >>>debug2: match found <<< 15247 1726867251.48248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867251.48312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867251.48358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867251.48362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867251.48521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.05499: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 15247 1726867253.05505: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 15247 1726867253.05566: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": 
"ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": 
"rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": 
"systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15247 1726867253.07009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867253.07013: stdout chunk (state=3): >>><<< 15247 1726867253.07019: stderr chunk (state=3): >>><<< 15247 1726867253.07050: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": 
{"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": 
{"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": 
"user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867253.08131: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867253.08147: _low_level_execute_command(): starting 15247 1726867253.08158: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867251.3362575-16271-203000563186230/ > /dev/null 2>&1 && sleep 0' 15247 1726867253.09293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.09368: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867253.09493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.09563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.11448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867253.11474: stderr chunk (state=3): >>><<< 15247 1726867253.11563: stdout chunk (state=3): >>><<< 15247 1726867253.11579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867253.11590: handler run complete 15247 1726867253.12094: variable 'ansible_facts' from source: unknown 15247 1726867253.12275: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867253.13273: variable 'ansible_facts' from source: unknown 15247 1726867253.13612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867253.14088: attempt loop complete, returning result 15247 1726867253.14107: _execute() done 15247 1726867253.14115: dumping result to json 15247 1726867253.14213: done dumping result, returning 15247 1726867253.14322: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-8ce3-1923-0000000002fb] 15247 1726867253.14325: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002fb ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867253.16665: no more pending results, returning what we have 15247 1726867253.16668: results queue empty 15247 1726867253.16669: checking for any_errors_fatal 15247 1726867253.16672: done checking for any_errors_fatal 15247 1726867253.16673: checking for max_fail_percentage 15247 1726867253.16674: done checking for max_fail_percentage 15247 1726867253.16675: checking to see if all hosts have failed and the running result is not ok 15247 1726867253.16676: done checking to see if all hosts have failed 15247 1726867253.16681: getting the remaining hosts for this loop 15247 1726867253.16683: done getting the remaining hosts for this loop 15247 1726867253.16686: getting the next task for host managed_node2 15247 1726867253.16691: done getting next task for host managed_node2 15247 1726867253.16694: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15247 1726867253.16696: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867253.16705: getting variables 15247 1726867253.16707: in VariableManager get_vars() 15247 1726867253.16733: Calling all_inventory to load vars for managed_node2 15247 1726867253.16736: Calling groups_inventory to load vars for managed_node2 15247 1726867253.16738: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867253.16746: Calling all_plugins_play to load vars for managed_node2 15247 1726867253.16749: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867253.16751: Calling groups_plugins_play to load vars for managed_node2 15247 1726867253.17484: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002fb 15247 1726867253.17488: WORKER PROCESS EXITING 15247 1726867253.19168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867253.22438: done with get_vars() 15247 1726867253.22575: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:20:53 -0400 (0:00:01.971) 0:00:22.936 ****** 15247 1726867253.22667: entering _queue_task() for managed_node2/package_facts 15247 1726867253.23310: worker is 1 (out of 1 available) 15247 1726867253.23436: exiting _queue_task() for managed_node2/package_facts 15247 1726867253.23449: done queuing things up, now waiting for results queue to drain 15247 
1726867253.23450: waiting for pending results... 15247 1726867253.23944: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15247 1726867253.24055: in run() - task 0affcac9-a3a5-8ce3-1923-0000000002fc 15247 1726867253.24069: variable 'ansible_search_path' from source: unknown 15247 1726867253.24073: variable 'ansible_search_path' from source: unknown 15247 1726867253.24309: calling self._execute() 15247 1726867253.24400: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867253.24413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867253.24424: variable 'omit' from source: magic vars 15247 1726867253.25200: variable 'ansible_distribution_major_version' from source: facts 15247 1726867253.25213: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867253.25220: variable 'omit' from source: magic vars 15247 1726867253.25278: variable 'omit' from source: magic vars 15247 1726867253.25519: variable 'omit' from source: magic vars 15247 1726867253.25558: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867253.25594: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867253.25617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867253.25635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867253.25647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867253.25680: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867253.25887: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867253.25891: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867253.25993: Set connection var ansible_shell_executable to /bin/sh 15247 1726867253.25997: Set connection var ansible_connection to ssh 15247 1726867253.26000: Set connection var ansible_shell_type to sh 15247 1726867253.26003: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867253.26014: Set connection var ansible_timeout to 10 15247 1726867253.26020: Set connection var ansible_pipelining to False 15247 1726867253.26047: variable 'ansible_shell_executable' from source: unknown 15247 1726867253.26050: variable 'ansible_connection' from source: unknown 15247 1726867253.26166: variable 'ansible_module_compression' from source: unknown 15247 1726867253.26171: variable 'ansible_shell_type' from source: unknown 15247 1726867253.26174: variable 'ansible_shell_executable' from source: unknown 15247 1726867253.26176: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867253.26179: variable 'ansible_pipelining' from source: unknown 15247 1726867253.26182: variable 'ansible_timeout' from source: unknown 15247 1726867253.26184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867253.26457: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867253.26467: variable 'omit' from source: magic vars 15247 1726867253.26473: starting attempt loop 15247 1726867253.26478: running the handler 15247 1726867253.26736: _low_level_execute_command(): starting 15247 1726867253.26739: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867253.28101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.28105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867253.28189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.28238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.29922: stdout chunk (state=3): >>>/root <<< 15247 1726867253.30083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867253.30086: stdout chunk (state=3): >>><<< 15247 1726867253.30089: stderr chunk (state=3): >>><<< 15247 1726867253.30107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867253.30208: _low_level_execute_command(): starting 15247 1726867253.30213: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520 `" && echo ansible-tmp-1726867253.3011408-16339-280189666586520="` echo /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520 `" ) && sleep 0' 15247 1726867253.30741: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867253.30754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867253.30768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867253.30787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867253.30833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.30913: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867253.30947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.31009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.33083: stdout chunk (state=3): >>>ansible-tmp-1726867253.3011408-16339-280189666586520=/root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520 <<< 15247 1726867253.33086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867253.33089: stdout chunk (state=3): >>><<< 15247 1726867253.33104: stderr chunk (state=3): >>><<< 15247 1726867253.33375: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867253.3011408-16339-280189666586520=/root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867253.33381: variable 'ansible_module_compression' from source: unknown 15247 1726867253.33493: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15247 1726867253.33522: variable 'ansible_facts' from source: unknown 15247 1726867253.33841: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py 15247 1726867253.33997: Sending initial data 15247 1726867253.34008: Sent initial data (162 bytes) 15247 1726867253.34613: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867253.34699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.34737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867253.34758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867253.34792: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.34833: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.36511: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867253.36527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867253.36574: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp7645qnem /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py <<< 15247 1726867253.36580: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py" <<< 15247 1726867253.36627: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp7645qnem" to remote "/root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py" <<< 15247 1726867253.38286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867253.38291: stdout chunk (state=3): >>><<< 15247 1726867253.38293: stderr chunk (state=3): >>><<< 15247 1726867253.38296: done transferring module to remote 15247 1726867253.38298: _low_level_execute_command(): starting 15247 1726867253.38300: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/ /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py && sleep 0' 15247 1726867253.39048: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867253.39063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867253.39144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867253.39158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.39198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867253.39259: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867253.39299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.39395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.41314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867253.41318: stdout chunk (state=3): >>><<< 15247 1726867253.41320: stderr chunk (state=3): >>><<< 15247 1726867253.41454: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867253.41461: _low_level_execute_command(): starting 15247 1726867253.41465: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/AnsiballZ_package_facts.py && sleep 0' 15247 1726867253.42208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867253.42212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867253.42215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867253.42217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867253.42219: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867253.42221: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867253.42223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.42225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867253.42231: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867253.42233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.42335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.86967: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": 
"redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": 
"6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 15247 1726867253.87035: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", 
"version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", 
"release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": 
[{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": 
"6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 15247 1726867253.87054: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": 
"pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": 
"0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": 
[{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": 
"libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": 
[{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", 
"version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": 
"langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": 
"102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": 
"vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", 
"version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15247 1726867253.88983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867253.88986: stdout chunk (state=3): >>><<< 15247 1726867253.88989: stderr chunk (state=3): >>><<< 15247 1726867253.89040: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867253.91361: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867253.91365: _low_level_execute_command(): starting 15247 1726867253.91367: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867253.3011408-16339-280189666586520/ > /dev/null 2>&1 && sleep 0' 15247 1726867253.92264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867253.92269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867253.92272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.92345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867253.92348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867253.92355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867253.92381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867253.92431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867253.94273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867253.94302: stderr chunk (state=3): >>><<< 15247 1726867253.94306: stdout chunk (state=3): >>><<< 15247 1726867253.94323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 15247 1726867253.94329: handler run complete 15247 1726867253.94792: variable 'ansible_facts' from source: unknown 15247 1726867253.95123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867253.96164: variable 'ansible_facts' from source: unknown 15247 1726867253.96607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867253.97265: attempt loop complete, returning result 15247 1726867253.97274: _execute() done 15247 1726867253.97279: dumping result to json 15247 1726867253.97474: done dumping result, returning 15247 1726867253.97479: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-8ce3-1923-0000000002fc] 15247 1726867253.97485: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002fc ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867253.99409: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000002fc 15247 1726867253.99419: no more pending results, returning what we have 15247 1726867253.99422: results queue empty 15247 1726867253.99422: checking for any_errors_fatal 15247 1726867253.99427: done checking for any_errors_fatal 15247 1726867253.99427: checking for max_fail_percentage 15247 1726867253.99428: done checking for max_fail_percentage 15247 1726867253.99429: checking to see if all hosts have failed and the running result is not ok 15247 1726867253.99429: done checking to see if all hosts have failed 15247 1726867253.99430: getting the remaining hosts for this loop 15247 1726867253.99432: done getting the remaining hosts for this loop 15247 1726867253.99435: getting the next task for host managed_node2 15247 1726867253.99440: done getting next task for host managed_node2 15247 
1726867253.99442: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15247 1726867253.99444: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867253.99451: WORKER PROCESS EXITING 15247 1726867253.99456: getting variables 15247 1726867253.99457: in VariableManager get_vars() 15247 1726867253.99481: Calling all_inventory to load vars for managed_node2 15247 1726867253.99483: Calling groups_inventory to load vars for managed_node2 15247 1726867253.99484: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867253.99491: Calling all_plugins_play to load vars for managed_node2 15247 1726867253.99493: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867253.99494: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.00259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.01273: done with get_vars() 15247 1726867254.01292: done getting variables 15247 1726867254.01337: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:20:54 -0400 (0:00:00.786) 0:00:23.723 ****** 15247 1726867254.01359: entering _queue_task() for managed_node2/debug 15247 1726867254.01608: worker is 1 (out of 1 available) 15247 
1726867254.01622: exiting _queue_task() for managed_node2/debug 15247 1726867254.01634: done queuing things up, now waiting for results queue to drain 15247 1726867254.01635: waiting for pending results... 15247 1726867254.01883: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 15247 1726867254.01944: in run() - task 0affcac9-a3a5-8ce3-1923-00000000003b 15247 1726867254.01956: variable 'ansible_search_path' from source: unknown 15247 1726867254.01960: variable 'ansible_search_path' from source: unknown 15247 1726867254.02009: calling self._execute() 15247 1726867254.02142: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.02148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.02152: variable 'omit' from source: magic vars 15247 1726867254.02447: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.02450: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.02463: variable 'omit' from source: magic vars 15247 1726867254.02492: variable 'omit' from source: magic vars 15247 1726867254.02562: variable 'network_provider' from source: set_fact 15247 1726867254.02576: variable 'omit' from source: magic vars 15247 1726867254.02611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867254.02640: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867254.02671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867254.02681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867254.02716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15247 1726867254.02749: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867254.02752: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.02755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.02844: Set connection var ansible_shell_executable to /bin/sh 15247 1726867254.02847: Set connection var ansible_connection to ssh 15247 1726867254.02850: Set connection var ansible_shell_type to sh 15247 1726867254.02853: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867254.02863: Set connection var ansible_timeout to 10 15247 1726867254.02868: Set connection var ansible_pipelining to False 15247 1726867254.02897: variable 'ansible_shell_executable' from source: unknown 15247 1726867254.02901: variable 'ansible_connection' from source: unknown 15247 1726867254.02903: variable 'ansible_module_compression' from source: unknown 15247 1726867254.02906: variable 'ansible_shell_type' from source: unknown 15247 1726867254.02910: variable 'ansible_shell_executable' from source: unknown 15247 1726867254.02912: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.02915: variable 'ansible_pipelining' from source: unknown 15247 1726867254.02924: variable 'ansible_timeout' from source: unknown 15247 1726867254.02928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.03084: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867254.03103: variable 'omit' from source: magic vars 15247 1726867254.03106: starting attempt loop 15247 1726867254.03113: running the handler 15247 1726867254.03162: handler run 
complete 15247 1726867254.03169: attempt loop complete, returning result 15247 1726867254.03172: _execute() done 15247 1726867254.03174: dumping result to json 15247 1726867254.03205: done dumping result, returning 15247 1726867254.03207: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-8ce3-1923-00000000003b] 15247 1726867254.03210: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003b 15247 1726867254.03303: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003b 15247 1726867254.03306: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: Using network provider: nm 15247 1726867254.03366: no more pending results, returning what we have 15247 1726867254.03369: results queue empty 15247 1726867254.03370: checking for any_errors_fatal 15247 1726867254.03382: done checking for any_errors_fatal 15247 1726867254.03383: checking for max_fail_percentage 15247 1726867254.03385: done checking for max_fail_percentage 15247 1726867254.03386: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.03387: done checking to see if all hosts have failed 15247 1726867254.03387: getting the remaining hosts for this loop 15247 1726867254.03389: done getting the remaining hosts for this loop 15247 1726867254.03392: getting the next task for host managed_node2 15247 1726867254.03400: done getting next task for host managed_node2 15247 1726867254.03404: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15247 1726867254.03405: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.03416: getting variables 15247 1726867254.03418: in VariableManager get_vars() 15247 1726867254.03453: Calling all_inventory to load vars for managed_node2 15247 1726867254.03457: Calling groups_inventory to load vars for managed_node2 15247 1726867254.03459: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.03470: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.03472: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.03475: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.04738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.06026: done with get_vars() 15247 1726867254.06051: done getting variables 15247 1726867254.06126: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:20:54 -0400 (0:00:00.047) 0:00:23.771 ****** 15247 1726867254.06153: entering _queue_task() for managed_node2/fail 15247 1726867254.06427: worker is 1 (out of 1 available) 15247 1726867254.06438: exiting _queue_task() for managed_node2/fail 15247 1726867254.06454: done queuing things up, now waiting for results queue to drain 15247 1726867254.06459: waiting for pending results... 
15247 1726867254.06758: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15247 1726867254.06831: in run() - task 0affcac9-a3a5-8ce3-1923-00000000003c 15247 1726867254.06840: variable 'ansible_search_path' from source: unknown 15247 1726867254.06866: variable 'ansible_search_path' from source: unknown 15247 1726867254.06887: calling self._execute() 15247 1726867254.06981: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.06985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.06994: variable 'omit' from source: magic vars 15247 1726867254.07268: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.07278: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.07359: variable 'network_state' from source: role '' defaults 15247 1726867254.07368: Evaluated conditional (network_state != {}): False 15247 1726867254.07371: when evaluation is False, skipping this task 15247 1726867254.07374: _execute() done 15247 1726867254.07376: dumping result to json 15247 1726867254.07392: done dumping result, returning 15247 1726867254.07395: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-8ce3-1923-00000000003c] 15247 1726867254.07398: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003c 15247 1726867254.07487: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003c 15247 1726867254.07489: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867254.07538: no more pending results, 
returning what we have 15247 1726867254.07542: results queue empty 15247 1726867254.07543: checking for any_errors_fatal 15247 1726867254.07551: done checking for any_errors_fatal 15247 1726867254.07552: checking for max_fail_percentage 15247 1726867254.07554: done checking for max_fail_percentage 15247 1726867254.07555: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.07556: done checking to see if all hosts have failed 15247 1726867254.07556: getting the remaining hosts for this loop 15247 1726867254.07558: done getting the remaining hosts for this loop 15247 1726867254.07561: getting the next task for host managed_node2 15247 1726867254.07566: done getting next task for host managed_node2 15247 1726867254.07569: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15247 1726867254.07571: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.07586: getting variables 15247 1726867254.07587: in VariableManager get_vars() 15247 1726867254.07617: Calling all_inventory to load vars for managed_node2 15247 1726867254.07619: Calling groups_inventory to load vars for managed_node2 15247 1726867254.07621: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.07629: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.07631: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.07634: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.08662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.09782: done with get_vars() 15247 1726867254.09804: done getting variables 15247 1726867254.09844: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:20:54 -0400 (0:00:00.037) 0:00:23.808 ****** 15247 1726867254.09870: entering _queue_task() for managed_node2/fail 15247 1726867254.10091: worker is 1 (out of 1 available) 15247 1726867254.10105: exiting _queue_task() for managed_node2/fail 15247 1726867254.10120: done queuing things up, now waiting for results queue to drain 15247 1726867254.10121: waiting for pending results... 
15247 1726867254.10359: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15247 1726867254.10485: in run() - task 0affcac9-a3a5-8ce3-1923-00000000003d 15247 1726867254.10490: variable 'ansible_search_path' from source: unknown 15247 1726867254.10496: variable 'ansible_search_path' from source: unknown 15247 1726867254.10515: calling self._execute() 15247 1726867254.10614: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.10634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.10638: variable 'omit' from source: magic vars 15247 1726867254.10943: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.10949: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.11038: variable 'network_state' from source: role '' defaults 15247 1726867254.11042: Evaluated conditional (network_state != {}): False 15247 1726867254.11045: when evaluation is False, skipping this task 15247 1726867254.11048: _execute() done 15247 1726867254.11051: dumping result to json 15247 1726867254.11053: done dumping result, returning 15247 1726867254.11062: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-8ce3-1923-00000000003d] 15247 1726867254.11065: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003d 15247 1726867254.11154: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003d 15247 1726867254.11157: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867254.11233: no more pending results, returning what we have 15247 
1726867254.11236: results queue empty 15247 1726867254.11237: checking for any_errors_fatal 15247 1726867254.11241: done checking for any_errors_fatal 15247 1726867254.11241: checking for max_fail_percentage 15247 1726867254.11243: done checking for max_fail_percentage 15247 1726867254.11244: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.11244: done checking to see if all hosts have failed 15247 1726867254.11245: getting the remaining hosts for this loop 15247 1726867254.11246: done getting the remaining hosts for this loop 15247 1726867254.11252: getting the next task for host managed_node2 15247 1726867254.11256: done getting next task for host managed_node2 15247 1726867254.11259: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15247 1726867254.11261: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.11273: getting variables 15247 1726867254.11274: in VariableManager get_vars() 15247 1726867254.11308: Calling all_inventory to load vars for managed_node2 15247 1726867254.11310: Calling groups_inventory to load vars for managed_node2 15247 1726867254.11312: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.11320: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.11322: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.11328: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.12231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.13159: done with get_vars() 15247 1726867254.13184: done getting variables 15247 1726867254.13230: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:20:54 -0400 (0:00:00.033) 0:00:23.842 ****** 15247 1726867254.13250: entering _queue_task() for managed_node2/fail 15247 1726867254.13468: worker is 1 (out of 1 available) 15247 1726867254.13483: exiting _queue_task() for managed_node2/fail 15247 1726867254.13495: done queuing things up, now waiting for results queue to drain 15247 1726867254.13496: waiting for pending results... 
15247 1726867254.13670: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15247 1726867254.13763: in run() - task 0affcac9-a3a5-8ce3-1923-00000000003e 15247 1726867254.13780: variable 'ansible_search_path' from source: unknown 15247 1726867254.13796: variable 'ansible_search_path' from source: unknown 15247 1726867254.13835: calling self._execute() 15247 1726867254.13905: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.13912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.13929: variable 'omit' from source: magic vars 15247 1726867254.14210: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.14225: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.14345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867254.16208: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.16299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.16330: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.16357: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.16396: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.16460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.16482: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.16515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.16555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.16570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.16666: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.16669: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15247 1726867254.16752: variable 'ansible_distribution' from source: facts 15247 1726867254.16755: variable '__network_rh_distros' from source: role '' defaults 15247 1726867254.16764: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15247 1726867254.16933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.16950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.16975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 
1726867254.16999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.17011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.17069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.17095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.17111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.17155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.17168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.17216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.17235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15247 1726867254.17266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.17294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.17313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.17561: variable 'network_connections' from source: play vars 15247 1726867254.17569: variable 'profile' from source: play vars 15247 1726867254.17630: variable 'profile' from source: play vars 15247 1726867254.17633: variable 'interface' from source: set_fact 15247 1726867254.17689: variable 'interface' from source: set_fact 15247 1726867254.17695: variable 'network_state' from source: role '' defaults 15247 1726867254.17739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867254.17845: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867254.17874: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867254.17897: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867254.17920: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867254.17951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867254.17972: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867254.17992: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.18012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867254.18028: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15247 1726867254.18031: when evaluation is False, skipping this task 15247 1726867254.18034: _execute() done 15247 1726867254.18036: dumping result to json 15247 1726867254.18039: done dumping result, returning 15247 1726867254.18048: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-8ce3-1923-00000000003e] 15247 1726867254.18053: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003e 15247 1726867254.18151: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003e 15247 1726867254.18157: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15247 
1726867254.18244: no more pending results, returning what we have 15247 1726867254.18247: results queue empty 15247 1726867254.18248: checking for any_errors_fatal 15247 1726867254.18253: done checking for any_errors_fatal 15247 1726867254.18254: checking for max_fail_percentage 15247 1726867254.18256: done checking for max_fail_percentage 15247 1726867254.18257: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.18258: done checking to see if all hosts have failed 15247 1726867254.18258: getting the remaining hosts for this loop 15247 1726867254.18262: done getting the remaining hosts for this loop 15247 1726867254.18266: getting the next task for host managed_node2 15247 1726867254.18270: done getting next task for host managed_node2 15247 1726867254.18273: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15247 1726867254.18275: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.18288: getting variables 15247 1726867254.18289: in VariableManager get_vars() 15247 1726867254.18322: Calling all_inventory to load vars for managed_node2 15247 1726867254.18324: Calling groups_inventory to load vars for managed_node2 15247 1726867254.18326: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.18334: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.18337: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.18339: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.19155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.20340: done with get_vars() 15247 1726867254.20371: done getting variables 15247 1726867254.20459: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:20:54 -0400 (0:00:00.072) 0:00:23.914 ****** 15247 1726867254.20492: entering _queue_task() for managed_node2/dnf 15247 1726867254.20843: worker is 1 (out of 1 available) 15247 1726867254.20862: exiting _queue_task() for managed_node2/dnf 15247 1726867254.20879: done queuing things up, now waiting for results queue to drain 15247 1726867254.20883: waiting for pending results... 
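The teaming-abort task above is skipped because its `when` conditional — `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0` — evaluated to False. The semantics of that Jinja expression can be emulated in plain Python (a sketch for illustration only, not the role's actual code; the sample data is invented — the play in this log defines no team interfaces, which is why the conditional is False):

```python
import re

# Invented stand-ins for the role's variables (illustrative only).
network_connections = [
    {"name": "ethtest0", "type": "ethernet"},
]
network_state = {}

def has_team(items):
    # Mirrors: items | selectattr("type", "defined")
    #                | selectattr("type", "match", "^team$") | list | length > 0
    defined = (i for i in items if "type" in i)          # selectattr("type", "defined")
    matched = [i for i in defined
               if re.match(r"^team$", i["type"])]       # selectattr("type", "match", "^team$")
    return len(matched) > 0                              # | list | length > 0

should_abort = (has_team(network_connections)
                or has_team(network_state.get("interfaces", [])))
print(should_abort)  # False, matching "Evaluated conditional ...: False" in the log
```

With no `type: team` entries in either variable, both branches are False, so the fail task is skipped exactly as the `skipping: [managed_node2]` result above records.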
15247 1726867254.21116: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15247 1726867254.21194: in run() - task 0affcac9-a3a5-8ce3-1923-00000000003f 15247 1726867254.21197: variable 'ansible_search_path' from source: unknown 15247 1726867254.21204: variable 'ansible_search_path' from source: unknown 15247 1726867254.21383: calling self._execute() 15247 1726867254.21387: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.21390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.21393: variable 'omit' from source: magic vars 15247 1726867254.21753: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.21762: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.21895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867254.24153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.24206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.24233: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.24258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.24279: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.24340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.24359: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.24380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.24407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.24420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.24497: variable 'ansible_distribution' from source: facts 15247 1726867254.24501: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.24515: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15247 1726867254.24590: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867254.24675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.24694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.24713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.24739: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.24752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.24780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.24796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.24814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.24838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.24850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.24881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.24896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 
1726867254.24915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.24939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.24949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.25050: variable 'network_connections' from source: play vars 15247 1726867254.25058: variable 'profile' from source: play vars 15247 1726867254.25106: variable 'profile' from source: play vars 15247 1726867254.25112: variable 'interface' from source: set_fact 15247 1726867254.25153: variable 'interface' from source: set_fact 15247 1726867254.25205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867254.25329: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867254.25355: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867254.25378: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867254.25404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867254.25435: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867254.25450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867254.25471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.25491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867254.25530: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867254.25676: variable 'network_connections' from source: play vars 15247 1726867254.25681: variable 'profile' from source: play vars 15247 1726867254.25730: variable 'profile' from source: play vars 15247 1726867254.25733: variable 'interface' from source: set_fact 15247 1726867254.25982: variable 'interface' from source: set_fact 15247 1726867254.25985: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867254.25988: when evaluation is False, skipping this task 15247 1726867254.25991: _execute() done 15247 1726867254.25993: dumping result to json 15247 1726867254.25996: done dumping result, returning 15247 1726867254.25999: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-00000000003f] 15247 1726867254.26001: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003f 15247 1726867254.26071: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000003f 15247 1726867254.26075: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15247 1726867254.26299: no more pending results, returning what we have 15247 1726867254.26302: results queue empty 15247 1726867254.26303: checking for any_errors_fatal 15247 1726867254.26310: done checking for any_errors_fatal 15247 1726867254.26311: checking for max_fail_percentage 15247 1726867254.26313: done checking for max_fail_percentage 15247 1726867254.26314: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.26315: done checking to see if all hosts have failed 15247 1726867254.26316: getting the remaining hosts for this loop 15247 1726867254.26317: done getting the remaining hosts for this loop 15247 1726867254.26320: getting the next task for host managed_node2 15247 1726867254.26325: done getting next task for host managed_node2 15247 1726867254.26328: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15247 1726867254.26330: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.26342: getting variables 15247 1726867254.26343: in VariableManager get_vars() 15247 1726867254.26376: Calling all_inventory to load vars for managed_node2 15247 1726867254.26381: Calling groups_inventory to load vars for managed_node2 15247 1726867254.26383: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.26392: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.26395: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.26397: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.27653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.29270: done with get_vars() 15247 1726867254.29291: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15247 1726867254.29362: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:20:54 -0400 (0:00:00.088) 0:00:24.003 ****** 15247 1726867254.29390: entering _queue_task() for managed_node2/yum 15247 1726867254.29644: worker is 1 (out of 1 available) 15247 1726867254.29656: exiting _queue_task() for managed_node2/yum 15247 1726867254.29666: done queuing things up, now waiting for results queue to drain 15247 1726867254.29668: waiting for pending results... 
15247 1726867254.29942: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15247 1726867254.30044: in run() - task 0affcac9-a3a5-8ce3-1923-000000000040 15247 1726867254.30065: variable 'ansible_search_path' from source: unknown 15247 1726867254.30072: variable 'ansible_search_path' from source: unknown 15247 1726867254.30116: calling self._execute() 15247 1726867254.30226: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.30238: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.30253: variable 'omit' from source: magic vars 15247 1726867254.30656: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.30673: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.30862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867254.33063: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.33140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.33183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.33232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.33283: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.33352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.33393: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.33432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.33557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.33560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.33602: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.33623: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15247 1726867254.33631: when evaluation is False, skipping this task 15247 1726867254.33640: _execute() done 15247 1726867254.33648: dumping result to json 15247 1726867254.33656: done dumping result, returning 15247 1726867254.33670: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000040] 15247 1726867254.33685: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000040 15247 1726867254.33852: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000040 15247 1726867254.33855: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15247 1726867254.33937: no more pending results, returning 
what we have 15247 1726867254.33941: results queue empty 15247 1726867254.33942: checking for any_errors_fatal 15247 1726867254.33949: done checking for any_errors_fatal 15247 1726867254.33950: checking for max_fail_percentage 15247 1726867254.33952: done checking for max_fail_percentage 15247 1726867254.33953: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.33954: done checking to see if all hosts have failed 15247 1726867254.33954: getting the remaining hosts for this loop 15247 1726867254.33956: done getting the remaining hosts for this loop 15247 1726867254.33960: getting the next task for host managed_node2 15247 1726867254.33967: done getting next task for host managed_node2 15247 1726867254.33970: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15247 1726867254.33972: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.33987: getting variables 15247 1726867254.33989: in VariableManager get_vars() 15247 1726867254.34028: Calling all_inventory to load vars for managed_node2 15247 1726867254.34030: Calling groups_inventory to load vars for managed_node2 15247 1726867254.34033: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.34043: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.34046: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.34048: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.35697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.37248: done with get_vars() 15247 1726867254.37272: done getting variables 15247 1726867254.37335: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:20:54 -0400 (0:00:00.079) 0:00:24.083 ****** 15247 1726867254.37369: entering _queue_task() for managed_node2/fail 15247 1726867254.37788: worker is 1 (out of 1 available) 15247 1726867254.37799: exiting _queue_task() for managed_node2/fail 15247 1726867254.37811: done queuing things up, now waiting for results queue to drain 15247 1726867254.37812: waiting for pending results... 
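An aside on the skip recorded above: the "Check if updates for network packages are available through the YUM package manager" task was skipped because its `when` condition, `ansible_distribution_major_version | int < 8`, evaluated False on this host. The condition is quoted verbatim from the log; the function below is an illustrative re-implementation (the name `yum_check_needed` is mine, not the role's), a minimal sketch of how the Jinja `int` filter plus comparison behaves.

```python
def yum_check_needed(distro_major: str) -> bool:
    """Mirror of the role's condition 'ansible_distribution_major_version | int < 8'.

    Ansible facts report the major version as a string, so the Jinja 'int'
    filter coerces it before comparing; the YUM-specific check only applies
    to EL 7 and older, where yum (not dnf) is the package manager.
    """
    return int(distro_major) < 8


# On this run the fact was >= 8, so the task was skipped:
assert yum_check_needed("7") is True
assert yum_check_needed("9") is False
```

Note also the `redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf` line earlier in the log: on ansible-core 2.17, `yum` is a redirect to the `dnf` action, which is why the loaded ActionModule is `dnf.py` even for a task written against `yum`.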
15247 1726867254.38094: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15247 1726867254.38098: in run() - task 0affcac9-a3a5-8ce3-1923-000000000041 15247 1726867254.38122: variable 'ansible_search_path' from source: unknown 15247 1726867254.38129: variable 'ansible_search_path' from source: unknown 15247 1726867254.38165: calling self._execute() 15247 1726867254.38263: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.38275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.38294: variable 'omit' from source: magic vars 15247 1726867254.38759: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.38763: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.38835: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867254.39044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867254.41286: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.41363: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.41411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.41453: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.41491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.41579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15247 1726867254.41620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.41653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.41703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.41882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.41886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.41888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.41890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.41892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.41894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.41937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.41966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.41998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.42047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.42069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.42256: variable 'network_connections' from source: play vars 15247 1726867254.42272: variable 'profile' from source: play vars 15247 1726867254.42345: variable 'profile' from source: play vars 15247 1726867254.42355: variable 'interface' from source: set_fact 15247 1726867254.42425: variable 'interface' from source: set_fact 15247 1726867254.42496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867254.42686: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867254.42728: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867254.42763: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867254.42801: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867254.42848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867254.42880: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867254.42914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.42984: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867254.42998: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867254.43248: variable 'network_connections' from source: play vars 15247 1726867254.43259: variable 'profile' from source: play vars 15247 1726867254.43330: variable 'profile' from source: play vars 15247 1726867254.43341: variable 'interface' from source: set_fact 15247 1726867254.43479: variable 'interface' from source: set_fact 15247 1726867254.43484: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867254.43487: when evaluation is False, skipping this task 15247 1726867254.43489: _execute() done 15247 1726867254.43491: dumping result to json 15247 1726867254.43493: done dumping result, returning 15247 1726867254.43495: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000041] 15247 1726867254.43504: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000041 15247 1726867254.43689: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000041 15247 1726867254.43692: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15247 1726867254.43774: no more pending results, returning what we have 15247 1726867254.43780: results queue empty 15247 1726867254.43781: checking for any_errors_fatal 15247 1726867254.43787: done checking for any_errors_fatal 15247 1726867254.43787: checking for max_fail_percentage 15247 1726867254.43790: done checking for max_fail_percentage 15247 1726867254.43791: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.43791: done checking to see if all hosts have failed 15247 1726867254.43792: getting the remaining hosts for this loop 15247 1726867254.43794: done getting the remaining hosts for this loop 15247 1726867254.43797: getting the next task for host managed_node2 15247 1726867254.43804: done getting next task for host managed_node2 15247 1726867254.43810: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15247 1726867254.43812: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.43825: getting variables 15247 1726867254.43827: in VariableManager get_vars() 15247 1726867254.43863: Calling all_inventory to load vars for managed_node2 15247 1726867254.43865: Calling groups_inventory to load vars for managed_node2 15247 1726867254.43868: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.43981: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.43986: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.43989: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.45361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.46927: done with get_vars() 15247 1726867254.46949: done getting variables 15247 1726867254.47012: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:20:54 -0400 (0:00:00.096) 0:00:24.179 ****** 15247 1726867254.47042: entering _queue_task() for managed_node2/package 15247 1726867254.47320: worker is 1 (out of 1 available) 15247 1726867254.47333: exiting _queue_task() for managed_node2/package 15247 1726867254.47344: done queuing things up, now waiting for results queue to drain 15247 1726867254.47345: waiting for pending results... 
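The "Ask user's consent to restart NetworkManager" task above was likewise skipped: its `false_condition` was `__network_wireless_connections_defined or __network_team_connections_defined`, and neither defined-flag held because none of the requested connections use a wireless or team interface. The sketch below is an assumption about how those role defaults are derived from `network_connections` (the helper name and the exact derivation are mine; only the or-condition itself appears in the log).

```python
def needs_nm_restart_consent(network_connections: list[dict]) -> bool:
    """Illustrative re-implementation (names assumed) of the condition the log
    evaluates: the consent prompt fires only when some requested connection
    profile uses a wireless or team interface type, since reconfiguring those
    may require restarting NetworkManager.
    """
    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)
    return wireless_defined or team_defined


# A plain ethernet profile, as in this run, triggers no consent prompt:
assert needs_nm_restart_consent([{"name": "eth0", "type": "ethernet"}]) is False
assert needs_nm_restart_consent([{"name": "wlan0", "type": "wireless"}]) is True
```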
15247 1726867254.47793: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 15247 1726867254.47798: in run() - task 0affcac9-a3a5-8ce3-1923-000000000042 15247 1726867254.47802: variable 'ansible_search_path' from source: unknown 15247 1726867254.47804: variable 'ansible_search_path' from source: unknown 15247 1726867254.47810: calling self._execute() 15247 1726867254.47888: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.47900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.47916: variable 'omit' from source: magic vars 15247 1726867254.48272: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.48289: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.48481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867254.48738: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867254.48793: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867254.48834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867254.48871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867254.48984: variable 'network_packages' from source: role '' defaults 15247 1726867254.49099: variable '__network_provider_setup' from source: role '' defaults 15247 1726867254.49122: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867254.49188: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867254.49202: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867254.49334: variable 
'__network_packages_default_nm' from source: role '' defaults 15247 1726867254.49503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867254.56699: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.56983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.56987: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.56989: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.56991: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.56993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.56996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.57072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.57128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.57150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 
1726867254.57199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.57234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.57262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.57449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.57468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.57701: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15247 1726867254.57829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.57857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.57891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.57937: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.57959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.58061: variable 'ansible_python' from source: facts 15247 1726867254.58182: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15247 1726867254.58185: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867254.58272: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867254.58409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.58446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.58480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.58530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.58549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.58601: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.58649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.58681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.58727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.58749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.58967: variable 'network_connections' from source: play vars 15247 1726867254.58970: variable 'profile' from source: play vars 15247 1726867254.59024: variable 'profile' from source: play vars 15247 1726867254.59038: variable 'interface' from source: set_fact 15247 1726867254.59120: variable 'interface' from source: set_fact 15247 1726867254.59197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867254.59233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867254.59272: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.59318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867254.59359: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867254.59663: variable 'network_connections' from source: play vars 15247 1726867254.59673: variable 'profile' from source: play vars 15247 1726867254.59840: variable 'profile' from source: play vars 15247 1726867254.59844: variable 'interface' from source: set_fact 15247 1726867254.59969: variable 'interface' from source: set_fact 15247 1726867254.60012: variable '__network_packages_default_wireless' from source: role '' defaults 15247 1726867254.60100: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867254.60789: variable 'network_connections' from source: play vars 15247 1726867254.60898: variable 'profile' from source: play vars 15247 1726867254.60901: variable 'profile' from source: play vars 15247 1726867254.60904: variable 'interface' from source: set_fact 15247 1726867254.61167: variable 'interface' from source: set_fact 15247 1726867254.61202: variable '__network_packages_default_team' from source: role '' defaults 15247 1726867254.61400: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867254.61959: variable 'network_connections' from source: play vars 15247 1726867254.61970: variable 'profile' from source: play vars 15247 1726867254.62101: variable 'profile' from source: play vars 15247 1726867254.62113: variable 'interface' from source: set_fact 15247 1726867254.62219: variable 'interface' from source: set_fact 15247 1726867254.62282: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867254.62345: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867254.62356: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867254.62422: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867254.62636: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15247 1726867254.63089: variable 'network_connections' from source: play vars 15247 1726867254.63098: variable 'profile' from source: play vars 15247 1726867254.63163: variable 'profile' from source: play vars 15247 1726867254.63173: variable 'interface' from source: set_fact 15247 1726867254.63246: variable 'interface' from source: set_fact 15247 1726867254.63260: variable 'ansible_distribution' from source: facts 15247 1726867254.63267: variable '__network_rh_distros' from source: role '' defaults 15247 1726867254.63276: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.63295: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15247 1726867254.63469: variable 'ansible_distribution' from source: facts 15247 1726867254.63481: variable '__network_rh_distros' from source: role '' defaults 15247 1726867254.63493: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.63512: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15247 1726867254.63682: variable 'ansible_distribution' from source: facts 15247 1726867254.63691: variable '__network_rh_distros' from source: role '' defaults 15247 1726867254.63702: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.63743: variable 'network_provider' from source: set_fact 15247 1726867254.63761: variable 'ansible_facts' from source: unknown 15247 1726867254.64435: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15247 
1726867254.64442: when evaluation is False, skipping this task 15247 1726867254.64448: _execute() done 15247 1726867254.64455: dumping result to json 15247 1726867254.64462: done dumping result, returning 15247 1726867254.64473: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-8ce3-1923-000000000042] 15247 1726867254.64483: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000042 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15247 1726867254.64626: no more pending results, returning what we have 15247 1726867254.64629: results queue empty 15247 1726867254.64630: checking for any_errors_fatal 15247 1726867254.64635: done checking for any_errors_fatal 15247 1726867254.64636: checking for max_fail_percentage 15247 1726867254.64638: done checking for max_fail_percentage 15247 1726867254.64639: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.64640: done checking to see if all hosts have failed 15247 1726867254.64640: getting the remaining hosts for this loop 15247 1726867254.64642: done getting the remaining hosts for this loop 15247 1726867254.64645: getting the next task for host managed_node2 15247 1726867254.64650: done getting next task for host managed_node2 15247 1726867254.64653: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15247 1726867254.64655: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.64669: getting variables 15247 1726867254.64671: in VariableManager get_vars() 15247 1726867254.64711: Calling all_inventory to load vars for managed_node2 15247 1726867254.64714: Calling groups_inventory to load vars for managed_node2 15247 1726867254.64716: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.64725: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.64732: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.64735: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.65500: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000042 15247 1726867254.65503: WORKER PROCESS EXITING 15247 1726867254.70588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.72096: done with get_vars() 15247 1726867254.72122: done getting variables 15247 1726867254.72168: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:20:54 -0400 (0:00:00.251) 0:00:24.431 ****** 15247 1726867254.72192: entering _queue_task() for managed_node2/package 15247 1726867254.72515: worker is 1 (out of 1 available) 15247 1726867254.72527: exiting _queue_task() for managed_node2/package 15247 1726867254.72541: done queuing things up, now waiting for results queue to drain 15247 1726867254.72542: waiting for pending results... 
15247 1726867254.72821: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15247 1726867254.72951: in run() - task 0affcac9-a3a5-8ce3-1923-000000000043 15247 1726867254.72971: variable 'ansible_search_path' from source: unknown 15247 1726867254.72982: variable 'ansible_search_path' from source: unknown 15247 1726867254.73027: calling self._execute() 15247 1726867254.73131: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.73143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.73158: variable 'omit' from source: magic vars 15247 1726867254.73560: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.73576: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.73709: variable 'network_state' from source: role '' defaults 15247 1726867254.73724: Evaluated conditional (network_state != {}): False 15247 1726867254.73731: when evaluation is False, skipping this task 15247 1726867254.73739: _execute() done 15247 1726867254.73745: dumping result to json 15247 1726867254.73751: done dumping result, returning 15247 1726867254.73767: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-8ce3-1923-000000000043] 15247 1726867254.73779: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000043 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867254.73930: no more pending results, returning what we have 15247 1726867254.73934: results queue empty 15247 1726867254.73935: checking for any_errors_fatal 15247 1726867254.73946: done checking for any_errors_fatal 15247 1726867254.73947: checking for max_fail_percentage 15247 
1726867254.73949: done checking for max_fail_percentage 15247 1726867254.73950: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.73951: done checking to see if all hosts have failed 15247 1726867254.73951: getting the remaining hosts for this loop 15247 1726867254.73953: done getting the remaining hosts for this loop 15247 1726867254.73957: getting the next task for host managed_node2 15247 1726867254.73964: done getting next task for host managed_node2 15247 1726867254.73967: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15247 1726867254.73969: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867254.73986: getting variables 15247 1726867254.73988: in VariableManager get_vars() 15247 1726867254.74026: Calling all_inventory to load vars for managed_node2 15247 1726867254.74029: Calling groups_inventory to load vars for managed_node2 15247 1726867254.74031: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.74043: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.74046: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.74048: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.74858: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000043 15247 1726867254.74862: WORKER PROCESS EXITING 15247 1726867254.75582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.77152: done with get_vars() 15247 1726867254.77172: done getting variables 15247 1726867254.77227: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:20:54 -0400 (0:00:00.050) 0:00:24.482 ****** 15247 1726867254.77254: entering _queue_task() for managed_node2/package 15247 1726867254.77516: worker is 1 (out of 1 available) 15247 1726867254.77526: exiting _queue_task() for managed_node2/package 15247 1726867254.77539: done queuing things up, now waiting for results queue to drain 15247 1726867254.77540: waiting for pending results... 15247 1726867254.77810: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15247 1726867254.77928: in run() - task 0affcac9-a3a5-8ce3-1923-000000000044 15247 1726867254.77949: variable 'ansible_search_path' from source: unknown 15247 1726867254.77957: variable 'ansible_search_path' from source: unknown 15247 1726867254.78002: calling self._execute() 15247 1726867254.78112: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.78115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.78183: variable 'omit' from source: magic vars 15247 1726867254.78506: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.78524: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.78650: variable 'network_state' from source: role '' defaults 15247 1726867254.78667: Evaluated conditional (network_state != {}): False 15247 1726867254.78764: when evaluation is False, 
skipping this task 15247 1726867254.78767: _execute() done 15247 1726867254.78769: dumping result to json 15247 1726867254.78771: done dumping result, returning 15247 1726867254.78774: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-8ce3-1923-000000000044] 15247 1726867254.78776: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000044 15247 1726867254.78843: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000044 15247 1726867254.78845: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867254.78890: no more pending results, returning what we have 15247 1726867254.78894: results queue empty 15247 1726867254.78895: checking for any_errors_fatal 15247 1726867254.78903: done checking for any_errors_fatal 15247 1726867254.78904: checking for max_fail_percentage 15247 1726867254.78905: done checking for max_fail_percentage 15247 1726867254.78909: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.78910: done checking to see if all hosts have failed 15247 1726867254.78910: getting the remaining hosts for this loop 15247 1726867254.78912: done getting the remaining hosts for this loop 15247 1726867254.78915: getting the next task for host managed_node2 15247 1726867254.78921: done getting next task for host managed_node2 15247 1726867254.78924: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15247 1726867254.78926: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.78941: getting variables 15247 1726867254.78943: in VariableManager get_vars() 15247 1726867254.78979: Calling all_inventory to load vars for managed_node2 15247 1726867254.78982: Calling groups_inventory to load vars for managed_node2 15247 1726867254.78984: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.78995: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.78998: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.79001: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.80544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.82101: done with get_vars() 15247 1726867254.82123: done getting variables 15247 1726867254.82183: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:20:54 -0400 (0:00:00.049) 0:00:24.531 ****** 15247 1726867254.82216: entering _queue_task() for managed_node2/service 15247 1726867254.82464: worker is 1 (out of 1 available) 15247 1726867254.82476: exiting _queue_task() for managed_node2/service 15247 1726867254.82690: done queuing things up, now waiting for results queue to drain 15247 1726867254.82691: waiting for pending results... 
15247 1726867254.82896: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15247 1726867254.82902: in run() - task 0affcac9-a3a5-8ce3-1923-000000000045 15247 1726867254.82929: variable 'ansible_search_path' from source: unknown 15247 1726867254.82937: variable 'ansible_search_path' from source: unknown 15247 1726867254.82978: calling self._execute() 15247 1726867254.83088: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.83101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.83121: variable 'omit' from source: magic vars 15247 1726867254.83505: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.83524: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.83640: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867254.83832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867254.86065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.86113: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.86159: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.86211: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.86245: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.86331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15247 1726867254.86391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.86401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.86448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.86467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.86582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.86585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.86588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.86626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.86645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.86689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.86725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.86753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.86796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.86820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.87001: variable 'network_connections' from source: play vars 15247 1726867254.87020: variable 'profile' from source: play vars 15247 1726867254.87152: variable 'profile' from source: play vars 15247 1726867254.87156: variable 'interface' from source: set_fact 15247 1726867254.87168: variable 'interface' from source: set_fact 15247 1726867254.87245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867254.87435: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867254.87482: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867254.87518: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867254.87550: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867254.87600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867254.87629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867254.87658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.87693: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867254.87982: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867254.87986: variable 'network_connections' from source: play vars 15247 1726867254.87996: variable 'profile' from source: play vars 15247 1726867254.88061: variable 'profile' from source: play vars 15247 1726867254.88069: variable 'interface' from source: set_fact 15247 1726867254.88140: variable 'interface' from source: set_fact 15247 1726867254.88168: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867254.88176: when evaluation is False, skipping this task 15247 1726867254.88186: _execute() done 15247 1726867254.88194: dumping result to json 15247 1726867254.88201: done dumping result, returning 15247 1726867254.88220: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000045] 15247 1726867254.88240: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000045 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15247 1726867254.88383: no more pending results, returning what we have 15247 1726867254.88386: results queue empty 15247 1726867254.88387: checking for any_errors_fatal 15247 1726867254.88393: done checking for any_errors_fatal 15247 1726867254.88394: checking for max_fail_percentage 15247 1726867254.88396: done checking for max_fail_percentage 15247 1726867254.88397: checking to see if all hosts have failed and the running result is not ok 15247 1726867254.88397: done checking to see if all hosts have failed 15247 1726867254.88398: getting the remaining hosts for this loop 15247 1726867254.88399: done getting the remaining hosts for this loop 15247 1726867254.88403: getting the next task for host managed_node2 15247 1726867254.88412: done getting next task for host managed_node2 15247 1726867254.88415: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15247 1726867254.88417: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867254.88428: getting variables 15247 1726867254.88430: in VariableManager get_vars() 15247 1726867254.88467: Calling all_inventory to load vars for managed_node2 15247 1726867254.88470: Calling groups_inventory to load vars for managed_node2 15247 1726867254.88472: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867254.88483: Calling all_plugins_play to load vars for managed_node2 15247 1726867254.88486: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867254.88489: Calling groups_plugins_play to load vars for managed_node2 15247 1726867254.89245: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000045 15247 1726867254.89249: WORKER PROCESS EXITING 15247 1726867254.90210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867254.91913: done with get_vars() 15247 1726867254.91934: done getting variables 15247 1726867254.91991: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:20:54 -0400 (0:00:00.098) 0:00:24.629 ****** 15247 1726867254.92023: entering _queue_task() for managed_node2/service 15247 1726867254.92298: worker is 1 (out of 1 available) 15247 1726867254.92312: exiting _queue_task() for managed_node2/service 15247 1726867254.92323: done queuing things up, now waiting for results queue to drain 15247 1726867254.92324: waiting for pending results... 
15247 1726867254.92597: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15247 1726867254.92704: in run() - task 0affcac9-a3a5-8ce3-1923-000000000046 15247 1726867254.92725: variable 'ansible_search_path' from source: unknown 15247 1726867254.92733: variable 'ansible_search_path' from source: unknown 15247 1726867254.92770: calling self._execute() 15247 1726867254.92871: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867254.92886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867254.92900: variable 'omit' from source: magic vars 15247 1726867254.93283: variable 'ansible_distribution_major_version' from source: facts 15247 1726867254.93299: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867254.93487: variable 'network_provider' from source: set_fact 15247 1726867254.93497: variable 'network_state' from source: role '' defaults 15247 1726867254.93512: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15247 1726867254.93523: variable 'omit' from source: magic vars 15247 1726867254.93569: variable 'omit' from source: magic vars 15247 1726867254.93669: variable 'network_service_name' from source: role '' defaults 15247 1726867254.93680: variable 'network_service_name' from source: role '' defaults 15247 1726867254.93793: variable '__network_provider_setup' from source: role '' defaults 15247 1726867254.93804: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867254.93867: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867254.93883: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867254.93950: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867254.94185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15247 1726867254.96344: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867254.96425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867254.96467: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867254.96518: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867254.96547: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867254.96630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.96664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.96700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.96747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.96765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.96821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15247 1726867254.96848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.96875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.96927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.96948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.97179: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15247 1726867254.97304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.97341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.97369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.97426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.97438: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.97534: variable 'ansible_python' from source: facts 15247 1726867254.97643: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15247 1726867254.97646: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867254.97729: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867254.97863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.97895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.97925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.97970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.97991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.98042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867254.98083: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867254.98114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.98154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867254.98171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867254.98319: variable 'network_connections' from source: play vars 15247 1726867254.98331: variable 'profile' from source: play vars 15247 1726867254.98482: variable 'profile' from source: play vars 15247 1726867254.98485: variable 'interface' from source: set_fact 15247 1726867254.98487: variable 'interface' from source: set_fact 15247 1726867254.98590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867254.98790: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867254.98849: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867254.98897: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867254.98947: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867254.99014: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867254.99050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867254.99095: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867254.99134: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867254.99188: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867254.99469: variable 'network_connections' from source: play vars 15247 1726867254.99491: variable 'profile' from source: play vars 15247 1726867254.99599: variable 'profile' from source: play vars 15247 1726867254.99602: variable 'interface' from source: set_fact 15247 1726867254.99642: variable 'interface' from source: set_fact 15247 1726867254.99676: variable '__network_packages_default_wireless' from source: role '' defaults 15247 1726867254.99764: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867255.00068: variable 'network_connections' from source: play vars 15247 1726867255.00142: variable 'profile' from source: play vars 15247 1726867255.00157: variable 'profile' from source: play vars 15247 1726867255.00166: variable 'interface' from source: set_fact 15247 1726867255.00243: variable 'interface' from source: set_fact 15247 1726867255.00279: variable '__network_packages_default_team' from source: role '' defaults 15247 1726867255.00364: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867255.00664: variable 
'network_connections' from source: play vars 15247 1726867255.00673: variable 'profile' from source: play vars 15247 1726867255.00750: variable 'profile' from source: play vars 15247 1726867255.00760: variable 'interface' from source: set_fact 15247 1726867255.00841: variable 'interface' from source: set_fact 15247 1726867255.00903: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867255.01011: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867255.01014: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867255.01046: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867255.01276: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15247 1726867255.01772: variable 'network_connections' from source: play vars 15247 1726867255.01785: variable 'profile' from source: play vars 15247 1726867255.01851: variable 'profile' from source: play vars 15247 1726867255.01881: variable 'interface' from source: set_fact 15247 1726867255.01944: variable 'interface' from source: set_fact 15247 1726867255.01957: variable 'ansible_distribution' from source: facts 15247 1726867255.01983: variable '__network_rh_distros' from source: role '' defaults 15247 1726867255.01986: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.01994: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15247 1726867255.02182: variable 'ansible_distribution' from source: facts 15247 1726867255.02199: variable '__network_rh_distros' from source: role '' defaults 15247 1726867255.02282: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.02286: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15247 1726867255.02405: variable 'ansible_distribution' from source: 
facts 15247 1726867255.02422: variable '__network_rh_distros' from source: role '' defaults 15247 1726867255.02432: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.02469: variable 'network_provider' from source: set_fact 15247 1726867255.02498: variable 'omit' from source: magic vars 15247 1726867255.02536: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867255.02566: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867255.02590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867255.02615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867255.02633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867255.02666: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867255.02673: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.02683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.02787: Set connection var ansible_shell_executable to /bin/sh 15247 1726867255.02851: Set connection var ansible_connection to ssh 15247 1726867255.02854: Set connection var ansible_shell_type to sh 15247 1726867255.02856: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867255.02858: Set connection var ansible_timeout to 10 15247 1726867255.02859: Set connection var ansible_pipelining to False 15247 1726867255.02861: variable 'ansible_shell_executable' from source: unknown 15247 1726867255.02862: variable 'ansible_connection' from source: unknown 15247 1726867255.02864: variable 'ansible_module_compression' from source: unknown 15247 1726867255.02866: 
variable 'ansible_shell_type' from source: unknown 15247 1726867255.02871: variable 'ansible_shell_executable' from source: unknown 15247 1726867255.02880: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.02892: variable 'ansible_pipelining' from source: unknown 15247 1726867255.02899: variable 'ansible_timeout' from source: unknown 15247 1726867255.02908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.03024: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867255.03042: variable 'omit' from source: magic vars 15247 1726867255.03069: starting attempt loop 15247 1726867255.03072: running the handler 15247 1726867255.03180: variable 'ansible_facts' from source: unknown 15247 1726867255.03901: _low_level_execute_command(): starting 15247 1726867255.03917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867255.04635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.04655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.04671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.04697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867255.04762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.04816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867255.04834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.04864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.04948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.06651: stdout chunk (state=3): >>>/root <<< 15247 1726867255.06786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.06797: stdout chunk (state=3): >>><<< 15247 1726867255.06811: stderr chunk (state=3): >>><<< 15247 1726867255.06835: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867255.06860: _low_level_execute_command(): starting 15247 1726867255.06871: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161 `" && echo ansible-tmp-1726867255.0684745-16407-75872643272161="` echo /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161 `" ) && sleep 0' 15247 1726867255.07487: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.07503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.07530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.07548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867255.07566: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867255.07646: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.07670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867255.07691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.07716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.07783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.09779: stdout chunk (state=3): >>>ansible-tmp-1726867255.0684745-16407-75872643272161=/root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161 <<< 15247 1726867255.09933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.09937: stdout chunk (state=3): >>><<< 15247 1726867255.09939: stderr chunk (state=3): >>><<< 15247 1726867255.09955: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867255.0684745-16407-75872643272161=/root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867255.10014: variable 'ansible_module_compression' from source: unknown 15247 1726867255.10125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15247 1726867255.10128: variable 'ansible_facts' from source: unknown 15247 1726867255.10340: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py 15247 1726867255.10559: Sending initial data 15247 1726867255.10563: Sent initial data (155 bytes) 15247 1726867255.11164: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.11167: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.11170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.11172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867255.11174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867255.11178: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867255.11181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.11183: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.11254: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867255.11257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.11299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.11358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.12973: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15247 1726867255.12976: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867255.13017: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867255.13056: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpae7pinh9 /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py <<< 15247 1726867255.13059: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py" <<< 15247 1726867255.13096: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpae7pinh9" to remote "/root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py" <<< 15247 1726867255.14184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.14232: stderr chunk (state=3): >>><<< 15247 1726867255.14235: stdout chunk (state=3): >>><<< 15247 1726867255.14237: done transferring module to remote 15247 1726867255.14244: _low_level_execute_command(): starting 15247 1726867255.14248: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/ /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py && sleep 0' 15247 1726867255.14635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.14669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867255.14672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 
1726867255.14674: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.14676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867255.14681: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.14722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.14725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.14780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.16603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.16622: stderr chunk (state=3): >>><<< 15247 1726867255.16625: stdout chunk (state=3): >>><<< 15247 1726867255.16636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867255.16638: _low_level_execute_command(): starting 15247 1726867255.16643: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/AnsiballZ_systemd.py && sleep 0' 15247 1726867255.17031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.17039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.17042: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.17044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.17092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.17095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.17142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.46704: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4517888", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308986368", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "740524000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredump<<< 15247 1726867255.46711: stdout chunk (state=3): >>>Receive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-<<< 15247 1726867255.46719: stdout chunk (state=3): >>>broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", 
"StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15247 1726867255.48616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867255.48644: stderr chunk (state=3): >>><<< 15247 1726867255.48647: stdout chunk (state=3): >>><<< 15247 1726867255.48661: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4517888", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3308986368", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "740524000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", 
"ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867255.48778: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867255.48793: _low_level_execute_command(): starting 15247 1726867255.48796: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867255.0684745-16407-75872643272161/ > /dev/null 2>&1 && sleep 0' 15247 1726867255.49390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.49431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.49447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867255.49459: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867255.49553: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.49633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.49674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.51534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.51556: stderr chunk (state=3): >>><<< 15247 1726867255.51559: stdout chunk (state=3): >>><<< 15247 1726867255.51570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 15247 1726867255.51579: handler run complete 15247 1726867255.51621: attempt loop complete, returning result 15247 1726867255.51624: _execute() done 15247 1726867255.51626: dumping result to json 15247 1726867255.51638: done dumping result, returning 15247 1726867255.51647: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-8ce3-1923-000000000046] 15247 1726867255.51653: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000046 15247 1726867255.51890: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000046 15247 1726867255.51896: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867255.51950: no more pending results, returning what we have 15247 1726867255.51953: results queue empty 15247 1726867255.51954: checking for any_errors_fatal 15247 1726867255.51961: done checking for any_errors_fatal 15247 1726867255.51961: checking for max_fail_percentage 15247 1726867255.51963: done checking for max_fail_percentage 15247 1726867255.51963: checking to see if all hosts have failed and the running result is not ok 15247 1726867255.51964: done checking to see if all hosts have failed 15247 1726867255.51965: getting the remaining hosts for this loop 15247 1726867255.51966: done getting the remaining hosts for this loop 15247 1726867255.51973: getting the next task for host managed_node2 15247 1726867255.51983: done getting next task for host managed_node2 15247 1726867255.51987: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15247 1726867255.51989: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867255.51998: getting variables 15247 1726867255.51999: in VariableManager get_vars() 15247 1726867255.52034: Calling all_inventory to load vars for managed_node2 15247 1726867255.52036: Calling groups_inventory to load vars for managed_node2 15247 1726867255.52038: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867255.52053: Calling all_plugins_play to load vars for managed_node2 15247 1726867255.52056: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867255.52059: Calling groups_plugins_play to load vars for managed_node2 15247 1726867255.53137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867255.54699: done with get_vars() 15247 1726867255.54725: done getting variables 15247 1726867255.54797: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:20:55 -0400 (0:00:00.627) 0:00:25.257 ****** 15247 1726867255.54824: entering _queue_task() for managed_node2/service 15247 1726867255.55142: worker is 1 (out of 1 available) 15247 1726867255.55155: exiting _queue_task() for managed_node2/service 15247 1726867255.55167: done queuing things up, now waiting for results queue to drain 15247 1726867255.55168: waiting for pending results... 
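The `"censored"` placeholder in the task result above is Ansible's `no_log` handling at work: the module ran normally (note `'_ansible_no_log': True` in the execute call), but the callback output is replaced with a notice instead of the real return data. A minimal sketch of the pattern — the task body and register name are illustrative, not taken from the role:

```yaml
# Assumed illustration of no_log behavior, matching the censored
# result in the log; 'nm_result' is a hypothetical variable name.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: NetworkManager
    state: started
  no_log: true          # display is censored, as seen in the log
  register: nm_result   # the registered result is still fully usable
                        # in later tasks; only printed output is hidden
```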
15247 1726867255.55379: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15247 1726867255.55595: in run() - task 0affcac9-a3a5-8ce3-1923-000000000047 15247 1726867255.55600: variable 'ansible_search_path' from source: unknown 15247 1726867255.55603: variable 'ansible_search_path' from source: unknown 15247 1726867255.55605: calling self._execute() 15247 1726867255.55678: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.55722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.55727: variable 'omit' from source: magic vars 15247 1726867255.56145: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.56149: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867255.56241: variable 'network_provider' from source: set_fact 15247 1726867255.56245: Evaluated conditional (network_provider == "nm"): True 15247 1726867255.56357: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867255.56428: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867255.56550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867255.58331: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867255.58376: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867255.58403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867255.58430: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867255.58451: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867255.58510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867255.58530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867255.58547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867255.58575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867255.58588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867255.58621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867255.58637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867255.58653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867255.58681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867255.58693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867255.58721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867255.58737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867255.58753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867255.58779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867255.58791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867255.58878: variable 'network_connections' from source: play vars 15247 1726867255.58887: variable 'profile' from source: play vars 15247 1726867255.58933: variable 'profile' from source: play vars 15247 1726867255.58936: variable 'interface' from source: set_fact 15247 1726867255.58992: variable 'interface' from source: set_fact 15247 1726867255.59040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867255.59156: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867255.59183: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867255.59204: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867255.59231: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867255.59258: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867255.59273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867255.59292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867255.59312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867255.59347: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867255.59492: variable 'network_connections' from source: play vars 15247 1726867255.59495: variable 'profile' from source: play vars 15247 1726867255.59542: variable 'profile' from source: play vars 15247 1726867255.59545: variable 'interface' from source: set_fact 15247 1726867255.59587: variable 'interface' from source: set_fact 15247 1726867255.59611: Evaluated conditional (__network_wpa_supplicant_required): False 15247 1726867255.59614: when evaluation is False, skipping this task 15247 1726867255.59617: _execute() done 15247 1726867255.59628: dumping result 
to json 15247 1726867255.59631: done dumping result, returning 15247 1726867255.59633: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-8ce3-1923-000000000047] 15247 1726867255.59635: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000047 15247 1726867255.59714: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000047 15247 1726867255.59717: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15247 1726867255.59758: no more pending results, returning what we have 15247 1726867255.59761: results queue empty 15247 1726867255.59762: checking for any_errors_fatal 15247 1726867255.59781: done checking for any_errors_fatal 15247 1726867255.59782: checking for max_fail_percentage 15247 1726867255.59784: done checking for max_fail_percentage 15247 1726867255.59785: checking to see if all hosts have failed and the running result is not ok 15247 1726867255.59785: done checking to see if all hosts have failed 15247 1726867255.59786: getting the remaining hosts for this loop 15247 1726867255.59787: done getting the remaining hosts for this loop 15247 1726867255.59791: getting the next task for host managed_node2 15247 1726867255.59796: done getting next task for host managed_node2 15247 1726867255.59800: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15247 1726867255.59801: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867255.59816: getting variables 15247 1726867255.59817: in VariableManager get_vars() 15247 1726867255.59850: Calling all_inventory to load vars for managed_node2 15247 1726867255.59853: Calling groups_inventory to load vars for managed_node2 15247 1726867255.59855: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867255.59863: Calling all_plugins_play to load vars for managed_node2 15247 1726867255.59866: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867255.59868: Calling groups_plugins_play to load vars for managed_node2 15247 1726867255.61170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867255.62694: done with get_vars() 15247 1726867255.62718: done getting variables 15247 1726867255.62770: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:20:55 -0400 (0:00:00.079) 0:00:25.337 ****** 15247 1726867255.62802: entering _queue_task() for managed_node2/service 15247 1726867255.63094: worker is 1 (out of 1 available) 15247 1726867255.63109: exiting _queue_task() for managed_node2/service 15247 1726867255.63122: done queuing things up, now waiting for results queue to drain 15247 1726867255.63124: waiting for pending results... 
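The skip reported above is driven by the task's `when` conditions, which the log evaluates one at a time: `ansible_distribution_major_version != '6'` → True, `network_provider == "nm"` → True, then `__network_wpa_supplicant_required` → False, producing `skip_reason: Conditional result was False` with `false_condition` naming the failing clause. A minimal sketch of a task guarded the same way — only the conditions are taken from the log; the task body is an assumption:

```yaml
# Hypothetical task body; the when-list mirrors the conditionals
# evaluated in the log for the wpa_supplicant task.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool
```

With a list-form `when`, the clauses are ANDed and evaluation stops at the first false one, which is the clause reported back as `false_condition`.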
15247 1726867255.63504: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 15247 1726867255.63527: in run() - task 0affcac9-a3a5-8ce3-1923-000000000048 15247 1726867255.63550: variable 'ansible_search_path' from source: unknown 15247 1726867255.63559: variable 'ansible_search_path' from source: unknown 15247 1726867255.63784: calling self._execute() 15247 1726867255.63788: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.63791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.63793: variable 'omit' from source: magic vars 15247 1726867255.64151: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.64171: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867255.64297: variable 'network_provider' from source: set_fact 15247 1726867255.64415: Evaluated conditional (network_provider == "initscripts"): False 15247 1726867255.64427: when evaluation is False, skipping this task 15247 1726867255.64464: _execute() done 15247 1726867255.64474: dumping result to json 15247 1726867255.64486: done dumping result, returning 15247 1726867255.64532: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-8ce3-1923-000000000048] 15247 1726867255.64536: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000048 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867255.64809: no more pending results, returning what we have 15247 1726867255.64812: results queue empty 15247 1726867255.64813: checking for any_errors_fatal 15247 1726867255.64819: done checking for any_errors_fatal 15247 1726867255.64820: checking for max_fail_percentage 15247 1726867255.64822: done checking for max_fail_percentage 15247 
1726867255.64822: checking to see if all hosts have failed and the running result is not ok 15247 1726867255.64823: done checking to see if all hosts have failed 15247 1726867255.64824: getting the remaining hosts for this loop 15247 1726867255.64825: done getting the remaining hosts for this loop 15247 1726867255.64828: getting the next task for host managed_node2 15247 1726867255.64832: done getting next task for host managed_node2 15247 1726867255.64835: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15247 1726867255.64837: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867255.64849: getting variables 15247 1726867255.64850: in VariableManager get_vars() 15247 1726867255.64884: Calling all_inventory to load vars for managed_node2 15247 1726867255.64886: Calling groups_inventory to load vars for managed_node2 15247 1726867255.64889: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867255.64897: Calling all_plugins_play to load vars for managed_node2 15247 1726867255.64899: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867255.64902: Calling groups_plugins_play to load vars for managed_node2 15247 1726867255.65460: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000048 15247 1726867255.65463: WORKER PROCESS EXITING 15247 1726867255.66226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867255.67972: done with get_vars() 15247 1726867255.67997: done getting variables 15247 1726867255.68061: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:20:55 -0400 (0:00:00.052) 0:00:25.390 ****** 15247 1726867255.68092: entering _queue_task() for managed_node2/copy 15247 1726867255.68368: worker is 1 (out of 1 available) 15247 1726867255.68382: exiting _queue_task() for managed_node2/copy 15247 1726867255.68394: done queuing things up, now waiting for results queue to drain 15247 1726867255.68396: waiting for pending results... 15247 1726867255.68689: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15247 1726867255.68818: in run() - task 0affcac9-a3a5-8ce3-1923-000000000049 15247 1726867255.68836: variable 'ansible_search_path' from source: unknown 15247 1726867255.68845: variable 'ansible_search_path' from source: unknown 15247 1726867255.68886: calling self._execute() 15247 1726867255.69002: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.69022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.69042: variable 'omit' from source: magic vars 15247 1726867255.69467: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.69471: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867255.69595: variable 'network_provider' from source: set_fact 15247 1726867255.69609: Evaluated conditional (network_provider == "initscripts"): False 15247 1726867255.69686: when evaluation is False, skipping this task 15247 1726867255.69693: _execute() done 15247 1726867255.69696: dumping result to json 
15247 1726867255.69699: done dumping result, returning 15247 1726867255.69702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-8ce3-1923-000000000049] 15247 1726867255.69705: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000049 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15247 1726867255.69934: no more pending results, returning what we have 15247 1726867255.69938: results queue empty 15247 1726867255.69939: checking for any_errors_fatal 15247 1726867255.69945: done checking for any_errors_fatal 15247 1726867255.69948: checking for max_fail_percentage 15247 1726867255.69950: done checking for max_fail_percentage 15247 1726867255.69951: checking to see if all hosts have failed and the running result is not ok 15247 1726867255.69952: done checking to see if all hosts have failed 15247 1726867255.69953: getting the remaining hosts for this loop 15247 1726867255.69954: done getting the remaining hosts for this loop 15247 1726867255.69958: getting the next task for host managed_node2 15247 1726867255.69965: done getting next task for host managed_node2 15247 1726867255.69968: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15247 1726867255.69971: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867255.69987: getting variables 15247 1726867255.69993: in VariableManager get_vars() 15247 1726867255.70033: Calling all_inventory to load vars for managed_node2 15247 1726867255.70035: Calling groups_inventory to load vars for managed_node2 15247 1726867255.70038: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867255.70050: Calling all_plugins_play to load vars for managed_node2 15247 1726867255.70055: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867255.70059: Calling groups_plugins_play to load vars for managed_node2 15247 1726867255.70592: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000049 15247 1726867255.70595: WORKER PROCESS EXITING 15247 1726867255.71548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867255.73132: done with get_vars() 15247 1726867255.73153: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:20:55 -0400 (0:00:00.051) 0:00:25.441 ****** 15247 1726867255.73239: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15247 1726867255.73504: worker is 1 (out of 1 available) 15247 1726867255.73519: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15247 1726867255.73531: done queuing things up, now waiting for results queue to drain 15247 1726867255.73533: waiting for pending results... 
15247 1726867255.73811: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15247 1726867255.73910: in run() - task 0affcac9-a3a5-8ce3-1923-00000000004a 15247 1726867255.73934: variable 'ansible_search_path' from source: unknown 15247 1726867255.73948: variable 'ansible_search_path' from source: unknown 15247 1726867255.73991: calling self._execute() 15247 1726867255.74100: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.74117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.74132: variable 'omit' from source: magic vars 15247 1726867255.74526: variable 'ansible_distribution_major_version' from source: facts 15247 1726867255.74550: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867255.74564: variable 'omit' from source: magic vars 15247 1726867255.74613: variable 'omit' from source: magic vars 15247 1726867255.74811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867255.76964: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867255.77143: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867255.77146: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867255.77149: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867255.77150: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867255.77229: variable 'network_provider' from source: set_fact 15247 1726867255.77365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867255.77425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867255.77463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867255.77513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867255.77533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867255.77617: variable 'omit' from source: magic vars 15247 1726867255.77738: variable 'omit' from source: magic vars 15247 1726867255.77845: variable 'network_connections' from source: play vars 15247 1726867255.77861: variable 'profile' from source: play vars 15247 1726867255.77931: variable 'profile' from source: play vars 15247 1726867255.77942: variable 'interface' from source: set_fact 15247 1726867255.78025: variable 'interface' from source: set_fact 15247 1726867255.78172: variable 'omit' from source: magic vars 15247 1726867255.78188: variable '__lsr_ansible_managed' from source: task vars 15247 1726867255.78350: variable '__lsr_ansible_managed' from source: task vars 15247 1726867255.78428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15247 1726867255.78960: Loaded config def from plugin (lookup/template) 15247 1726867255.78969: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15247 1726867255.79013: File lookup term: get_ansible_managed.j2 15247 1726867255.79022: variable 'ansible_search_path' from source: unknown 15247 1726867255.79031: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15247 1726867255.79052: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15247 1726867255.79076: variable 'ansible_search_path' from source: unknown 15247 1726867255.84941: variable 'ansible_managed' from source: unknown 15247 1726867255.85060: variable 'omit' from source: magic vars 15247 1726867255.85093: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867255.85155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867255.85158: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867255.85168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867255.85185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867255.85218: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867255.85227: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.85236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.85372: Set connection var ansible_shell_executable to /bin/sh 15247 1726867255.85375: Set connection var ansible_connection to ssh 15247 1726867255.85380: Set connection var ansible_shell_type to sh 15247 1726867255.85382: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867255.85384: Set connection var ansible_timeout to 10 15247 1726867255.85386: Set connection var ansible_pipelining to False 15247 1726867255.85409: variable 'ansible_shell_executable' from source: unknown 15247 1726867255.85417: variable 'ansible_connection' from source: unknown 15247 1726867255.85483: variable 'ansible_module_compression' from source: unknown 15247 1726867255.85486: variable 'ansible_shell_type' from source: unknown 15247 1726867255.85488: variable 'ansible_shell_executable' from source: unknown 15247 1726867255.85490: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867255.85492: variable 'ansible_pipelining' from source: unknown 15247 1726867255.85493: variable 'ansible_timeout' from source: unknown 15247 1726867255.85495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867255.85583: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867255.85612: variable 'omit' from source: magic vars 15247 1726867255.85623: starting attempt loop 15247 1726867255.85630: running the handler 15247 1726867255.85646: _low_level_execute_command(): starting 15247 1726867255.85659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867255.86374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.86475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.86486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.86503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.86582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.88250: stdout chunk (state=3): >>>/root <<< 15247 1726867255.88410: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 15247 1726867255.88413: stdout chunk (state=3): >>><<< 15247 1726867255.88416: stderr chunk (state=3): >>><<< 15247 1726867255.88433: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867255.88448: _low_level_execute_command(): starting 15247 1726867255.88525: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858 `" && echo ansible-tmp-1726867255.8843827-16460-18835651737858="` echo /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858 `" ) && sleep 0' 15247 1726867255.89063: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.89135: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 15247 1726867255.89149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.89200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867255.89221: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.89248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.89318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.91271: stdout chunk (state=3): >>>ansible-tmp-1726867255.8843827-16460-18835651737858=/root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858 <<< 15247 1726867255.91423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.91426: stdout chunk (state=3): >>><<< 15247 1726867255.91428: stderr chunk (state=3): >>><<< 15247 1726867255.91543: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867255.8843827-16460-18835651737858=/root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867255.91546: variable 'ansible_module_compression' from source: unknown 15247 1726867255.91549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15247 1726867255.91579: variable 'ansible_facts' from source: unknown 15247 1726867255.91698: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py 15247 1726867255.91901: Sending initial data 15247 1726867255.91905: Sent initial data (167 bytes) 15247 1726867255.92448: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.92565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867255.92595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.92615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.92687: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.94250: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15247 1726867255.94268: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867255.94326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867255.94359: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpqfvgn_aq /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py <<< 15247 1726867255.94380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py" <<< 15247 1726867255.94422: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpqfvgn_aq" to remote "/root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py" <<< 15247 1726867255.95437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.95487: stderr chunk (state=3): >>><<< 15247 1726867255.95490: stdout chunk (state=3): >>><<< 15247 1726867255.95492: done transferring module to remote 15247 1726867255.95502: _low_level_execute_command(): starting 15247 1726867255.95515: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/ /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py && sleep 0' 15247 1726867255.96100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.96124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.96142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.96161: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867255.96247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.96291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.96361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867255.98261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867255.98276: stdout chunk (state=3): >>><<< 15247 1726867255.98292: stderr chunk (state=3): >>><<< 15247 1726867255.98315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867255.98329: _low_level_execute_command(): starting 15247 1726867255.98338: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/AnsiballZ_network_connections.py && sleep 0' 15247 1726867255.98933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867255.98946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867255.98960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867255.99043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867255.99081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867255.99104: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867255.99123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867255.99266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867256.30705: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 15247 1726867256.33186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867256.33201: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867256.33384: stdout chunk (state=3): >>><<< 15247 1726867256.33387: stderr chunk (state=3): >>><<< 15247 1726867256.33391: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867256.33393: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867256.33396: _low_level_execute_command(): starting 15247 1726867256.33398: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867255.8843827-16460-18835651737858/ > /dev/null 2>&1 && sleep 0' 15247 1726867256.34617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867256.34631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867256.34642: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867256.34882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867256.34886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867256.34986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867256.36854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867256.36858: stdout chunk (state=3): >>><<< 15247 1726867256.36939: stderr chunk (state=3): >>><<< 15247 1726867256.37100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867256.37103: handler run complete 15247 1726867256.37140: attempt loop complete, returning result 15247 1726867256.37148: _execute() done 15247 1726867256.37154: dumping result to json 15247 1726867256.37159: done dumping result, returning 15247 1726867256.37173: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-8ce3-1923-00000000004a] 15247 1726867256.37176: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004a 15247 1726867256.37601: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004a 15247 1726867256.37604: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15247 1726867256.37696: no more pending results, returning what we have 15247 1726867256.37699: results queue empty 15247 1726867256.37701: checking for any_errors_fatal 15247 1726867256.37706: done checking for any_errors_fatal 15247 1726867256.37707: checking for max_fail_percentage 15247 1726867256.37735: done checking for max_fail_percentage 15247 1726867256.37736: checking to see if all hosts have failed and the running result is not ok 15247 1726867256.37737: done checking to see if all hosts have failed 15247 1726867256.37738: getting the remaining hosts for this loop 15247 1726867256.37740: done getting the remaining hosts for this loop 15247 1726867256.37748: getting the next task for host managed_node2 15247 1726867256.37755: done getting next task for host managed_node2 15247 1726867256.37759: ^ task is: TASK: fedora.linux_system_roles.network 
: Configure networking state 15247 1726867256.37766: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867256.37782: getting variables 15247 1726867256.37784: in VariableManager get_vars() 15247 1726867256.37941: Calling all_inventory to load vars for managed_node2 15247 1726867256.37944: Calling groups_inventory to load vars for managed_node2 15247 1726867256.37947: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867256.37958: Calling all_plugins_play to load vars for managed_node2 15247 1726867256.37961: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867256.37964: Calling groups_plugins_play to load vars for managed_node2 15247 1726867256.40776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867256.42978: done with get_vars() 15247 1726867256.43134: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:20:56 -0400 (0:00:00.700) 0:00:26.142 ****** 15247 1726867256.43290: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15247 1726867256.43807: worker is 1 (out of 1 available) 15247 1726867256.43819: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15247 1726867256.43830: done queuing things up, now waiting for results queue to drain 15247 1726867256.43831: waiting for pending results... 
15247 1726867256.44071: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 15247 1726867256.44440: in run() - task 0affcac9-a3a5-8ce3-1923-00000000004b 15247 1726867256.44444: variable 'ansible_search_path' from source: unknown 15247 1726867256.44447: variable 'ansible_search_path' from source: unknown 15247 1726867256.44548: calling self._execute() 15247 1726867256.44653: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.44713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.44767: variable 'omit' from source: magic vars 15247 1726867256.45157: variable 'ansible_distribution_major_version' from source: facts 15247 1726867256.45464: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867256.45529: variable 'network_state' from source: role '' defaults 15247 1726867256.45713: Evaluated conditional (network_state != {}): False 15247 1726867256.45740: when evaluation is False, skipping this task 15247 1726867256.45748: _execute() done 15247 1726867256.45773: dumping result to json 15247 1726867256.45784: done dumping result, returning 15247 1726867256.45795: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-8ce3-1923-00000000004b] 15247 1726867256.45912: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004b 15247 1726867256.46121: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004b skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867256.46246: no more pending results, returning what we have 15247 1726867256.46251: results queue empty 15247 1726867256.46253: checking for any_errors_fatal 15247 1726867256.46267: done checking for any_errors_fatal 15247 1726867256.46268: checking for 
max_fail_percentage 15247 1726867256.46270: done checking for max_fail_percentage 15247 1726867256.46271: checking to see if all hosts have failed and the running result is not ok 15247 1726867256.46272: done checking to see if all hosts have failed 15247 1726867256.46272: getting the remaining hosts for this loop 15247 1726867256.46275: done getting the remaining hosts for this loop 15247 1726867256.46285: getting the next task for host managed_node2 15247 1726867256.46291: done getting next task for host managed_node2 15247 1726867256.46295: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15247 1726867256.46298: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867256.46317: getting variables 15247 1726867256.46319: in VariableManager get_vars() 15247 1726867256.46362: Calling all_inventory to load vars for managed_node2 15247 1726867256.46365: Calling groups_inventory to load vars for managed_node2 15247 1726867256.46368: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867256.46711: Calling all_plugins_play to load vars for managed_node2 15247 1726867256.46716: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867256.46720: Calling groups_plugins_play to load vars for managed_node2 15247 1726867256.47391: WORKER PROCESS EXITING 15247 1726867256.49666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867256.53051: done with get_vars() 15247 1726867256.53082: done getting variables 15247 1726867256.53264: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:20:56 -0400 (0:00:00.100) 0:00:26.242 ****** 15247 1726867256.53299: entering _queue_task() for managed_node2/debug 15247 1726867256.54119: worker is 1 (out of 1 available) 15247 1726867256.54131: exiting _queue_task() for managed_node2/debug 15247 1726867256.54144: done queuing things up, now waiting for results queue to drain 15247 1726867256.54146: waiting for pending results... 15247 1726867256.54550: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15247 1726867256.54995: in run() - task 0affcac9-a3a5-8ce3-1923-00000000004c 15247 1726867256.54999: variable 'ansible_search_path' from source: unknown 15247 1726867256.55001: variable 'ansible_search_path' from source: unknown 15247 1726867256.55004: calling self._execute() 15247 1726867256.55153: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.55384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.55387: variable 'omit' from source: magic vars 15247 1726867256.56045: variable 'ansible_distribution_major_version' from source: facts 15247 1726867256.56063: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867256.56095: variable 'omit' from source: magic vars 15247 1726867256.56234: variable 'omit' from source: magic vars 15247 1726867256.56279: variable 'omit' from source: magic vars 15247 1726867256.56344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867256.56449: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867256.56503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867256.56603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867256.56624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867256.56852: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867256.56856: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.56859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.57069: Set connection var ansible_shell_executable to /bin/sh 15247 1726867256.57073: Set connection var ansible_connection to ssh 15247 1726867256.57075: Set connection var ansible_shell_type to sh 15247 1726867256.57079: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867256.57081: Set connection var ansible_timeout to 10 15247 1726867256.57084: Set connection var ansible_pipelining to False 15247 1726867256.57086: variable 'ansible_shell_executable' from source: unknown 15247 1726867256.57088: variable 'ansible_connection' from source: unknown 15247 1726867256.57091: variable 'ansible_module_compression' from source: unknown 15247 1726867256.57093: variable 'ansible_shell_type' from source: unknown 15247 1726867256.57095: variable 'ansible_shell_executable' from source: unknown 15247 1726867256.57097: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.57100: variable 'ansible_pipelining' from source: unknown 15247 1726867256.57102: variable 'ansible_timeout' from source: unknown 15247 1726867256.57104: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867256.57412: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867256.57507: variable 'omit' from source: magic vars 15247 1726867256.57582: starting attempt loop 15247 1726867256.57586: running the handler 15247 1726867256.57689: variable '__network_connections_result' from source: set_fact 15247 1726867256.57861: handler run complete 15247 1726867256.57953: attempt loop complete, returning result 15247 1726867256.57961: _execute() done 15247 1726867256.57969: dumping result to json 15247 1726867256.57976: done dumping result, returning 15247 1726867256.57991: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-8ce3-1923-00000000004c] 15247 1726867256.58000: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004c ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 15247 1726867256.58269: no more pending results, returning what we have 15247 1726867256.58273: results queue empty 15247 1726867256.58274: checking for any_errors_fatal 15247 1726867256.58283: done checking for any_errors_fatal 15247 1726867256.58283: checking for max_fail_percentage 15247 1726867256.58286: done checking for max_fail_percentage 15247 1726867256.58287: checking to see if all hosts have failed and the running result is not ok 15247 1726867256.58287: done checking to see if all hosts have failed 15247 1726867256.58288: getting the remaining hosts for this loop 15247 1726867256.58290: done getting the remaining hosts for this loop 15247 1726867256.58294: getting the next task for host managed_node2 15247 1726867256.58301: done getting next task for host managed_node2 15247 
1726867256.58305: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15247 1726867256.58307: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867256.58319: getting variables 15247 1726867256.58321: in VariableManager get_vars() 15247 1726867256.58472: Calling all_inventory to load vars for managed_node2 15247 1726867256.58475: Calling groups_inventory to load vars for managed_node2 15247 1726867256.58480: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867256.58491: Calling all_plugins_play to load vars for managed_node2 15247 1726867256.58495: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867256.58498: Calling groups_plugins_play to load vars for managed_node2 15247 1726867256.59495: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004c 15247 1726867256.59498: WORKER PROCESS EXITING 15247 1726867256.61743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867256.65334: done with get_vars() 15247 1726867256.65360: done getting variables 15247 1726867256.65539: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:20:56 -0400 (0:00:00.122) 0:00:26.365 ****** 
15247 1726867256.65571: entering _queue_task() for managed_node2/debug 15247 1726867256.66304: worker is 1 (out of 1 available) 15247 1726867256.66319: exiting _queue_task() for managed_node2/debug 15247 1726867256.66331: done queuing things up, now waiting for results queue to drain 15247 1726867256.66333: waiting for pending results... 15247 1726867256.66958: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15247 1726867256.67081: in run() - task 0affcac9-a3a5-8ce3-1923-00000000004d 15247 1726867256.67200: variable 'ansible_search_path' from source: unknown 15247 1726867256.67214: variable 'ansible_search_path' from source: unknown 15247 1726867256.67260: calling self._execute() 15247 1726867256.67550: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.67563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.67580: variable 'omit' from source: magic vars 15247 1726867256.68347: variable 'ansible_distribution_major_version' from source: facts 15247 1726867256.68396: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867256.68430: variable 'omit' from source: magic vars 15247 1726867256.68638: variable 'omit' from source: magic vars 15247 1726867256.68642: variable 'omit' from source: magic vars 15247 1726867256.68784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867256.68803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867256.68832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867256.68876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867256.69073: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867256.69076: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867256.69079: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.69081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.69241: Set connection var ansible_shell_executable to /bin/sh 15247 1726867256.69292: Set connection var ansible_connection to ssh 15247 1726867256.69299: Set connection var ansible_shell_type to sh 15247 1726867256.69308: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867256.69320: Set connection var ansible_timeout to 10 15247 1726867256.69328: Set connection var ansible_pipelining to False 15247 1726867256.69352: variable 'ansible_shell_executable' from source: unknown 15247 1726867256.69402: variable 'ansible_connection' from source: unknown 15247 1726867256.69414: variable 'ansible_module_compression' from source: unknown 15247 1726867256.69423: variable 'ansible_shell_type' from source: unknown 15247 1726867256.69582: variable 'ansible_shell_executable' from source: unknown 15247 1726867256.69585: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.69588: variable 'ansible_pipelining' from source: unknown 15247 1726867256.69590: variable 'ansible_timeout' from source: unknown 15247 1726867256.69592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.69763: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867256.69842: variable 'omit' from source: magic vars 15247 1726867256.69851: starting attempt 
loop 15247 1726867256.69857: running the handler 15247 1726867256.69906: variable '__network_connections_result' from source: set_fact 15247 1726867256.70121: variable '__network_connections_result' from source: set_fact 15247 1726867256.70407: handler run complete 15247 1726867256.70418: attempt loop complete, returning result 15247 1726867256.70427: _execute() done 15247 1726867256.70434: dumping result to json 15247 1726867256.70443: done dumping result, returning 15247 1726867256.70457: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-8ce3-1923-00000000004d] 15247 1726867256.70468: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004d 15247 1726867256.70983: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004d 15247 1726867256.70986: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15247 1726867256.71073: no more pending results, returning what we have 15247 1726867256.71080: results queue empty 15247 1726867256.71081: checking for any_errors_fatal 15247 1726867256.71089: done checking for any_errors_fatal 15247 1726867256.71089: checking for max_fail_percentage 15247 1726867256.71091: done checking for max_fail_percentage 15247 1726867256.71092: checking to see if all hosts have failed and the running result is not ok 15247 1726867256.71094: done checking to see if all hosts have failed 15247 1726867256.71094: getting the remaining hosts for this loop 15247 1726867256.71097: done getting the remaining hosts for this loop 15247 1726867256.71101: getting the next task for 
host managed_node2 15247 1726867256.71108: done getting next task for host managed_node2 15247 1726867256.71113: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15247 1726867256.71116: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867256.71126: getting variables 15247 1726867256.71128: in VariableManager get_vars() 15247 1726867256.71165: Calling all_inventory to load vars for managed_node2 15247 1726867256.71168: Calling groups_inventory to load vars for managed_node2 15247 1726867256.71170: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867256.71292: Calling all_plugins_play to load vars for managed_node2 15247 1726867256.71297: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867256.71301: Calling groups_plugins_play to load vars for managed_node2 15247 1726867256.74150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867256.77501: done with get_vars() 15247 1726867256.77528: done getting variables 15247 1726867256.77685: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:20:56 -0400 (0:00:00.121) 0:00:26.486 ****** 15247 1726867256.77726: entering _queue_task() for 
managed_node2/debug 15247 1726867256.78308: worker is 1 (out of 1 available) 15247 1726867256.78321: exiting _queue_task() for managed_node2/debug 15247 1726867256.78332: done queuing things up, now waiting for results queue to drain 15247 1726867256.78333: waiting for pending results... 15247 1726867256.79140: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15247 1726867256.79484: in run() - task 0affcac9-a3a5-8ce3-1923-00000000004e 15247 1726867256.79489: variable 'ansible_search_path' from source: unknown 15247 1726867256.79491: variable 'ansible_search_path' from source: unknown 15247 1726867256.79494: calling self._execute() 15247 1726867256.79768: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.79784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.79800: variable 'omit' from source: magic vars 15247 1726867256.80683: variable 'ansible_distribution_major_version' from source: facts 15247 1726867256.80686: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867256.81083: variable 'network_state' from source: role '' defaults 15247 1726867256.81086: Evaluated conditional (network_state != {}): False 15247 1726867256.81088: when evaluation is False, skipping this task 15247 1726867256.81090: _execute() done 15247 1726867256.81092: dumping result to json 15247 1726867256.81094: done dumping result, returning 15247 1726867256.81096: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-8ce3-1923-00000000004e] 15247 1726867256.81098: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004e 15247 1726867256.81162: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004e 15247 1726867256.81165: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": 
"network_state != {}" } 15247 1726867256.81215: no more pending results, returning what we have 15247 1726867256.81219: results queue empty 15247 1726867256.81221: checking for any_errors_fatal 15247 1726867256.81230: done checking for any_errors_fatal 15247 1726867256.81230: checking for max_fail_percentage 15247 1726867256.81232: done checking for max_fail_percentage 15247 1726867256.81233: checking to see if all hosts have failed and the running result is not ok 15247 1726867256.81234: done checking to see if all hosts have failed 15247 1726867256.81235: getting the remaining hosts for this loop 15247 1726867256.81236: done getting the remaining hosts for this loop 15247 1726867256.81239: getting the next task for host managed_node2 15247 1726867256.81246: done getting next task for host managed_node2 15247 1726867256.81249: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15247 1726867256.81252: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867256.81267: getting variables 15247 1726867256.81269: in VariableManager get_vars() 15247 1726867256.81308: Calling all_inventory to load vars for managed_node2 15247 1726867256.81313: Calling groups_inventory to load vars for managed_node2 15247 1726867256.81316: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867256.81328: Calling all_plugins_play to load vars for managed_node2 15247 1726867256.81331: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867256.81334: Calling groups_plugins_play to load vars for managed_node2 15247 1726867256.84696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867256.89525: done with get_vars() 15247 1726867256.89549: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:20:56 -0400 (0:00:00.119) 0:00:26.606 ****** 15247 1726867256.89647: entering _queue_task() for managed_node2/ping 15247 1726867256.90796: worker is 1 (out of 1 available) 15247 1726867256.90808: exiting _queue_task() for managed_node2/ping 15247 1726867256.90824: done queuing things up, now waiting for results queue to drain 15247 1726867256.90825: waiting for pending results... 
15247 1726867256.91320: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15247 1726867256.91505: in run() - task 0affcac9-a3a5-8ce3-1923-00000000004f 15247 1726867256.91685: variable 'ansible_search_path' from source: unknown 15247 1726867256.91688: variable 'ansible_search_path' from source: unknown 15247 1726867256.91691: calling self._execute() 15247 1726867256.91839: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.91851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.91912: variable 'omit' from source: magic vars 15247 1726867256.92713: variable 'ansible_distribution_major_version' from source: facts 15247 1726867256.92730: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867256.92781: variable 'omit' from source: magic vars 15247 1726867256.92896: variable 'omit' from source: magic vars 15247 1726867256.92941: variable 'omit' from source: magic vars 15247 1726867256.93042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867256.93145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867256.93171: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867256.93199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867256.93361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867256.93365: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867256.93367: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.93369: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867256.93583: Set connection var ansible_shell_executable to /bin/sh 15247 1726867256.93592: Set connection var ansible_connection to ssh 15247 1726867256.93697: Set connection var ansible_shell_type to sh 15247 1726867256.93700: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867256.93702: Set connection var ansible_timeout to 10 15247 1726867256.93704: Set connection var ansible_pipelining to False 15247 1726867256.93806: variable 'ansible_shell_executable' from source: unknown 15247 1726867256.93810: variable 'ansible_connection' from source: unknown 15247 1726867256.93812: variable 'ansible_module_compression' from source: unknown 15247 1726867256.93815: variable 'ansible_shell_type' from source: unknown 15247 1726867256.93817: variable 'ansible_shell_executable' from source: unknown 15247 1726867256.93819: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867256.93821: variable 'ansible_pipelining' from source: unknown 15247 1726867256.93823: variable 'ansible_timeout' from source: unknown 15247 1726867256.93825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867256.94256: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867256.94273: variable 'omit' from source: magic vars 15247 1726867256.94300: starting attempt loop 15247 1726867256.94401: running the handler 15247 1726867256.94404: _low_level_execute_command(): starting 15247 1726867256.94406: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867256.96240: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867256.96255: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867256.96286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867256.96368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867256.98037: stdout chunk (state=3): >>>/root <<< 15247 1726867256.98182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867256.98185: stdout chunk (state=3): >>><<< 15247 1726867256.98196: stderr chunk (state=3): >>><<< 15247 1726867256.98219: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867256.98427: _low_level_execute_command(): starting 15247 1726867256.98432: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604 `" && echo ansible-tmp-1726867256.9822698-16497-218012547132604="` echo /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604 `" ) && sleep 0' 15247 1726867256.99388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867256.99454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867256.99468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867256.99490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867256.99640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867256.99801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867256.99837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.01730: stdout chunk (state=3): >>>ansible-tmp-1726867256.9822698-16497-218012547132604=/root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604 <<< 15247 1726867257.01894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867257.01898: stdout chunk (state=3): >>><<< 15247 1726867257.01904: stderr chunk (state=3): >>><<< 15247 1726867257.01952: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867256.9822698-16497-218012547132604=/root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867257.02059: variable 'ansible_module_compression' from source: unknown 15247 1726867257.02102: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15247 1726867257.02137: variable 'ansible_facts' from source: unknown 15247 1726867257.02325: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py 15247 1726867257.02861: Sending initial data 15247 1726867257.02864: Sent initial data (153 bytes) 15247 1726867257.04066: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867257.04085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867257.04091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867257.04110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867257.04117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.04401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.04405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867257.04407: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867257.04597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.04716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.06283: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15247 1726867257.06291: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867257.06321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867257.06366: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpdkvwvno1 /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py <<< 15247 1726867257.06701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpdkvwvno1" to remote "/root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py" <<< 15247 1726867257.07693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867257.07751: stderr chunk (state=3): >>><<< 15247 1726867257.07754: stdout chunk (state=3): >>><<< 15247 1726867257.07775: done transferring module to remote 15247 1726867257.07793: _low_level_execute_command(): starting 15247 1726867257.07796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/ /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py && sleep 0' 15247 1726867257.09016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867257.09031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.09143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867257.09181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.09322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.11091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867257.11095: stdout chunk (state=3): >>><<< 15247 1726867257.11103: stderr chunk (state=3): >>><<< 15247 1726867257.11162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867257.11166: _low_level_execute_command(): starting 15247 1726867257.11168: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/AnsiballZ_ping.py && sleep 0' 15247 1726867257.12434: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867257.12536: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867257.12685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.12753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.27884: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15247 1726867257.29264: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867257.29268: stdout chunk (state=3): >>><<< 15247 1726867257.29273: stderr chunk (state=3): >>><<< 15247 1726867257.29308: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867257.29330: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867257.29339: _low_level_execute_command(): starting 15247 1726867257.29344: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867256.9822698-16497-218012547132604/ > /dev/null 2>&1 && sleep 0' 15247 1726867257.31098: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.31143: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867257.31165: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867257.31216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.31295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.33103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867257.33152: stderr chunk (state=3): >>><<< 15247 1726867257.33171: stdout chunk (state=3): >>><<< 15247 1726867257.33202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867257.33271: handler run complete 15247 1726867257.33297: attempt loop complete, returning result 15247 1726867257.33317: _execute() done 
15247 1726867257.33372: dumping result to json 15247 1726867257.33486: done dumping result, returning 15247 1726867257.33489: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-8ce3-1923-00000000004f] 15247 1726867257.33492: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004f 15247 1726867257.33573: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000004f 15247 1726867257.33884: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 15247 1726867257.33971: no more pending results, returning what we have 15247 1726867257.33975: results queue empty 15247 1726867257.33976: checking for any_errors_fatal 15247 1726867257.33988: done checking for any_errors_fatal 15247 1726867257.33989: checking for max_fail_percentage 15247 1726867257.33990: done checking for max_fail_percentage 15247 1726867257.33991: checking to see if all hosts have failed and the running result is not ok 15247 1726867257.33992: done checking to see if all hosts have failed 15247 1726867257.33993: getting the remaining hosts for this loop 15247 1726867257.33994: done getting the remaining hosts for this loop 15247 1726867257.33997: getting the next task for host managed_node2 15247 1726867257.34005: done getting next task for host managed_node2 15247 1726867257.34007: ^ task is: TASK: meta (role_complete) 15247 1726867257.34008: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867257.34024: getting variables 15247 1726867257.34026: in VariableManager get_vars() 15247 1726867257.34063: Calling all_inventory to load vars for managed_node2 15247 1726867257.34066: Calling groups_inventory to load vars for managed_node2 15247 1726867257.34067: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867257.34190: Calling all_plugins_play to load vars for managed_node2 15247 1726867257.34196: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867257.34226: Calling groups_plugins_play to load vars for managed_node2 15247 1726867257.37942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867257.42172: done with get_vars() 15247 1726867257.42272: done getting variables 15247 1726867257.42471: done queuing things up, now waiting for results queue to drain 15247 1726867257.42473: results queue empty 15247 1726867257.42474: checking for any_errors_fatal 15247 1726867257.42480: done checking for any_errors_fatal 15247 1726867257.42484: checking for max_fail_percentage 15247 1726867257.42485: done checking for max_fail_percentage 15247 1726867257.42486: checking to see if all hosts have failed and the running result is not ok 15247 1726867257.42487: done checking to see if all hosts have failed 15247 1726867257.42487: getting the remaining hosts for this loop 15247 1726867257.42488: done getting the remaining hosts for this loop 15247 1726867257.42493: getting the next task for host managed_node2 15247 1726867257.42497: done getting next task for host managed_node2 15247 1726867257.42498: ^ task is: TASK: meta (flush_handlers) 15247 1726867257.42500: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15247 1726867257.42505: getting variables 15247 1726867257.42630: in VariableManager get_vars() 15247 1726867257.42643: Calling all_inventory to load vars for managed_node2 15247 1726867257.42645: Calling groups_inventory to load vars for managed_node2 15247 1726867257.42647: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867257.42655: Calling all_plugins_play to load vars for managed_node2 15247 1726867257.42657: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867257.42659: Calling groups_plugins_play to load vars for managed_node2 15247 1726867257.45219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867257.49226: done with get_vars() 15247 1726867257.49251: done getting variables 15247 1726867257.49370: in VariableManager get_vars() 15247 1726867257.49486: Calling all_inventory to load vars for managed_node2 15247 1726867257.49488: Calling groups_inventory to load vars for managed_node2 15247 1726867257.49495: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867257.49509: Calling all_plugins_play to load vars for managed_node2 15247 1726867257.49514: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867257.49521: Calling groups_plugins_play to load vars for managed_node2 15247 1726867257.52684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867257.56435: done with get_vars() 15247 1726867257.56466: done queuing things up, now waiting for results queue to drain 15247 1726867257.56469: results queue empty 15247 1726867257.56469: checking for any_errors_fatal 15247 1726867257.56471: done checking for any_errors_fatal 15247 1726867257.56471: checking for max_fail_percentage 15247 1726867257.56472: done checking for max_fail_percentage 15247 1726867257.56473: checking to see if all hosts have failed and 
the running result is not ok 15247 1726867257.56474: done checking to see if all hosts have failed 15247 1726867257.56475: getting the remaining hosts for this loop 15247 1726867257.56476: done getting the remaining hosts for this loop 15247 1726867257.56516: getting the next task for host managed_node2 15247 1726867257.56522: done getting next task for host managed_node2 15247 1726867257.56523: ^ task is: TASK: meta (flush_handlers) 15247 1726867257.56525: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867257.56527: getting variables 15247 1726867257.56528: in VariableManager get_vars() 15247 1726867257.56583: Calling all_inventory to load vars for managed_node2 15247 1726867257.56586: Calling groups_inventory to load vars for managed_node2 15247 1726867257.56588: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867257.56593: Calling all_plugins_play to load vars for managed_node2 15247 1726867257.56595: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867257.56598: Calling groups_plugins_play to load vars for managed_node2 15247 1726867257.59363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867257.62928: done with get_vars() 15247 1726867257.62958: done getting variables 15247 1726867257.63010: in VariableManager get_vars() 15247 1726867257.63026: Calling all_inventory to load vars for managed_node2 15247 1726867257.63028: Calling groups_inventory to load vars for managed_node2 15247 1726867257.63030: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867257.63035: Calling all_plugins_play to load vars for managed_node2 15247 1726867257.63038: Calling 
groups_plugins_inventory to load vars for managed_node2 15247 1726867257.63157: Calling groups_plugins_play to load vars for managed_node2 15247 1726867257.65717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867257.69121: done with get_vars() 15247 1726867257.69265: done queuing things up, now waiting for results queue to drain 15247 1726867257.69267: results queue empty 15247 1726867257.69268: checking for any_errors_fatal 15247 1726867257.69270: done checking for any_errors_fatal 15247 1726867257.69270: checking for max_fail_percentage 15247 1726867257.69271: done checking for max_fail_percentage 15247 1726867257.69272: checking to see if all hosts have failed and the running result is not ok 15247 1726867257.69273: done checking to see if all hosts have failed 15247 1726867257.69273: getting the remaining hosts for this loop 15247 1726867257.69274: done getting the remaining hosts for this loop 15247 1726867257.69280: getting the next task for host managed_node2 15247 1726867257.69283: done getting next task for host managed_node2 15247 1726867257.69284: ^ task is: None 15247 1726867257.69286: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867257.69287: done queuing things up, now waiting for results queue to drain 15247 1726867257.69288: results queue empty 15247 1726867257.69288: checking for any_errors_fatal 15247 1726867257.69289: done checking for any_errors_fatal 15247 1726867257.69290: checking for max_fail_percentage 15247 1726867257.69290: done checking for max_fail_percentage 15247 1726867257.69291: checking to see if all hosts have failed and the running result is not ok 15247 1726867257.69292: done checking to see if all hosts have failed 15247 1726867257.69293: getting the next task for host managed_node2 15247 1726867257.69295: done getting next task for host managed_node2 15247 1726867257.69296: ^ task is: None 15247 1726867257.69297: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867257.69414: in VariableManager get_vars() 15247 1726867257.69431: done with get_vars() 15247 1726867257.69437: in VariableManager get_vars() 15247 1726867257.69448: done with get_vars() 15247 1726867257.69453: variable 'omit' from source: magic vars 15247 1726867257.69713: in VariableManager get_vars() 15247 1726867257.69726: done with get_vars() 15247 1726867257.69749: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 15247 1726867257.70064: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867257.70242: getting the remaining hosts for this loop 15247 1726867257.70243: done getting the remaining hosts for this loop 15247 1726867257.70246: getting the next task for host managed_node2 15247 1726867257.70248: done getting next task for host managed_node2 15247 1726867257.70250: ^ task is: TASK: Gathering Facts 15247 1726867257.70252: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867257.70254: getting variables 15247 1726867257.70255: in VariableManager get_vars() 15247 1726867257.70263: Calling all_inventory to load vars for managed_node2 15247 1726867257.70266: Calling groups_inventory to load vars for managed_node2 15247 1726867257.70268: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867257.70273: Calling all_plugins_play to load vars for managed_node2 15247 1726867257.70275: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867257.70281: Calling groups_plugins_play to load vars for managed_node2 15247 1726867257.72788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867257.85865: done with get_vars() 15247 1726867257.85892: done getting variables 15247 1726867257.85947: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 17:20:57 -0400 (0:00:00.963) 0:00:27.569 ****** 15247 1726867257.85972: entering _queue_task() for managed_node2/gather_facts 15247 1726867257.86317: worker is 1 (out of 1 available) 15247 1726867257.86330: exiting _queue_task() for managed_node2/gather_facts 15247 1726867257.86341: done queuing things up, now waiting for results queue to drain 15247 1726867257.86343: waiting for pending results... 
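[editor's note] The queueing step above hands the Gathering Facts task to a worker; the first thing the executor then does on the wire is probe the remote home directory by running `/bin/sh -c 'echo ~ && sleep 0'` through the SSH connection (visible as `_low_level_execute_command()` in the chunks that follow). A minimal local sketch of that wrapped-command pattern, using `subprocess` as a stand-in for the SSH connection plugin (the hostnames, ControlMaster socket, and keys from this trace are omitted; the helper name is illustrative, not Ansible's API):

```python
import subprocess

def low_level_execute(cmd: str) -> tuple[int, str, str]:
    """Run a command the way the trace shows Ansible wrapping it:
    /bin/sh -c '<cmd> && sleep 0'. The trailing 'sleep 0' yields the
    shell once so buffered output is flushed before exit status is read.
    (Local stand-in: the real call goes over the multiplexed SSH master.)"""
    wrapped = ["/bin/sh", "-c", f"{cmd} && sleep 0"]
    proc = subprocess.run(wrapped, capture_output=True, text=True)
    return proc.returncode, proc.stdout, proc.stderr

# Same probe as in the trace: expand '~' to find the remote user's home,
# which anchors the later mkdir of ~/.ansible/tmp/ansible-tmp-... dirs.
rc, out, err = low_level_execute("echo ~")
```

On the target in this trace the probe returns `rc=0` with stdout `/root`, which is then used to build the `ansible-tmp-<epoch>-<pid>-<random>` directory that `AnsiballZ_setup.py` is uploaded into.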
15247 1726867257.86633: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867257.86762: in run() - task 0affcac9-a3a5-8ce3-1923-000000000382 15247 1726867257.86791: variable 'ansible_search_path' from source: unknown 15247 1726867257.86836: calling self._execute() 15247 1726867257.86949: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867257.86963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867257.86981: variable 'omit' from source: magic vars 15247 1726867257.87409: variable 'ansible_distribution_major_version' from source: facts 15247 1726867257.87694: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867257.87697: variable 'omit' from source: magic vars 15247 1726867257.87700: variable 'omit' from source: magic vars 15247 1726867257.87702: variable 'omit' from source: magic vars 15247 1726867257.87747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867257.87792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867257.87986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867257.87990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867257.87993: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867257.88006: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867257.88020: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867257.88029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867257.88347: Set connection var ansible_shell_executable to /bin/sh 15247 1726867257.88350: Set 
connection var ansible_connection to ssh 15247 1726867257.88353: Set connection var ansible_shell_type to sh 15247 1726867257.88355: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867257.88357: Set connection var ansible_timeout to 10 15247 1726867257.88359: Set connection var ansible_pipelining to False 15247 1726867257.88565: variable 'ansible_shell_executable' from source: unknown 15247 1726867257.88568: variable 'ansible_connection' from source: unknown 15247 1726867257.88570: variable 'ansible_module_compression' from source: unknown 15247 1726867257.88572: variable 'ansible_shell_type' from source: unknown 15247 1726867257.88574: variable 'ansible_shell_executable' from source: unknown 15247 1726867257.88576: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867257.88580: variable 'ansible_pipelining' from source: unknown 15247 1726867257.88582: variable 'ansible_timeout' from source: unknown 15247 1726867257.88584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867257.88849: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867257.88898: variable 'omit' from source: magic vars 15247 1726867257.88909: starting attempt loop 15247 1726867257.88919: running the handler 15247 1726867257.88959: variable 'ansible_facts' from source: unknown 15247 1726867257.88986: _low_level_execute_command(): starting 15247 1726867257.89001: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867257.89803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.89839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867257.89852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867257.89870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.89938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.91637: stdout chunk (state=3): >>>/root <<< 15247 1726867257.91742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867257.91775: stderr chunk (state=3): >>><<< 15247 1726867257.91845: stdout chunk (state=3): >>><<< 15247 1726867257.91872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867257.91887: _low_level_execute_command(): starting 15247 1726867257.91891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840 `" && echo ansible-tmp-1726867257.918723-16528-241062375239840="` echo /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840 `" ) && sleep 0' 15247 1726867257.92927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867257.92947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.93046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867257.93073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.93143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.95200: stdout chunk (state=3): >>>ansible-tmp-1726867257.918723-16528-241062375239840=/root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840 <<< 15247 1726867257.95238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867257.95242: stdout chunk (state=3): >>><<< 15247 1726867257.95248: stderr chunk (state=3): >>><<< 15247 1726867257.95267: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867257.918723-16528-241062375239840=/root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867257.95305: variable 'ansible_module_compression' from source: unknown 15247 1726867257.95346: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867257.95412: variable 'ansible_facts' from source: unknown 15247 1726867257.95852: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py 15247 1726867257.96130: Sending initial data 15247 1726867257.96133: Sent initial data (153 bytes) 15247 1726867257.96788: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867257.96791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867257.96793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867257.96795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867257.96797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867257.96799: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867257.96801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.96803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867257.96805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867257.96839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867257.96842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867257.96893: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867257.97150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867257.98781: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15247 1726867257.98785: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15247 1726867257.98787: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15247 1726867257.98791: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15247 1726867257.98794: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 15247 1726867257.98796: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 15247 1726867257.98798: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 15247 1726867257.98800: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15247 1726867257.98802: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15247 1726867257.98803: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 15247 1726867257.98805: stderr chunk (state=3): >>>debug2: Unrecognised server extension 
"home-directory" <<< 15247 1726867257.98807: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867257.98850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867257.98897: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp8okgv8qp /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py <<< 15247 1726867257.98900: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py" <<< 15247 1726867257.98955: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp8okgv8qp" to remote "/root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py" <<< 15247 1726867258.02042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867258.02046: stdout chunk (state=3): >>><<< 15247 1726867258.02048: stderr chunk (state=3): >>><<< 15247 1726867258.02056: done transferring module to remote 15247 1726867258.02060: _low_level_execute_command(): starting 15247 1726867258.02062: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/ /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py && sleep 0' 15247 1726867258.03076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867258.03082: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867258.03084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867258.03090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867258.03137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867258.03140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867258.03508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867258.03591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867258.05410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867258.05417: stdout chunk (state=3): >>><<< 15247 1726867258.05419: stderr chunk (state=3): >>><<< 15247 1726867258.05445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867258.05448: _low_level_execute_command(): starting 15247 1726867258.05451: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/AnsiballZ_setup.py && sleep 0' 15247 1726867258.06757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867258.06760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867258.06762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867258.06765: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867258.06767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 <<< 15247 1726867258.06769: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867258.06863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867258.06866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867258.07220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867258.67892: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "58", "epoch": "1726867258", "epoch_int": "1726867258", "date": "2024-09-20", "time": "17:20:58", "iso8601_micro": "2024-09-20T21:20:58.349352Z", "iso8601": "2024-09-20T21:20:58Z", "iso8601_basic": "20240920T172058349352", "iso8601_basic_short": "20240920T172058", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.54833984375, "5m": 0.38525390625, "15m": 0.19189453125}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on<<< 15247 1726867258.67923: stdout chunk (state=3): >>> [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert"<<< 15247 1726867258.67956: stdout chunk (state=3): >>>: "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": 
"250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 496, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796945920, "block_size": 4096, "block_total": 65519099, "block_available": 63915270, "block_used": 1603829, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867258.70089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867258.70093: stdout chunk (state=3): >>><<< 15247 1726867258.70095: stderr chunk (state=3): >>><<< 15247 1726867258.70101: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "20", "second": "58", "epoch": "1726867258", "epoch_int": "1726867258", "date": "2024-09-20", "time": "17:20:58", "iso8601_micro": "2024-09-20T21:20:58.349352Z", "iso8601": "2024-09-20T21:20:58Z", "iso8601_basic": "20240920T172058349352", "iso8601_basic_short": "20240920T172058", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_loadavg": {"1m": 0.54833984375, "5m": 0.38525390625, "15m": 0.19189453125}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": 
"10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off 
[fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2957, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 574, "free": 2957}, "nocache": {"free": 3294, "used": 237}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 496, 
"ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796945920, "block_size": 4096, "block_total": 65519099, "block_available": 63915270, "block_used": 1603829, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867258.70676: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867258.70682: _low_level_execute_command(): starting 15247 1726867258.70684: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867257.918723-16528-241062375239840/ > /dev/null 2>&1 && sleep 0' 15247 1726867258.71525: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867258.71552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867258.71662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867258.71824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867258.71868: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867258.72018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867258.73828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867258.73899: stderr chunk (state=3): >>><<< 15247 1726867258.73934: stdout chunk (state=3): >>><<< 15247 1726867258.74202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867258.74206: handler run complete 15247 1726867258.74208: 
variable 'ansible_facts' from source: unknown 15247 1726867258.74281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867258.74638: variable 'ansible_facts' from source: unknown 15247 1726867258.74738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867258.74927: attempt loop complete, returning result 15247 1726867258.74935: _execute() done 15247 1726867258.74942: dumping result to json 15247 1726867258.74989: done dumping result, returning 15247 1726867258.75003: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-000000000382] 15247 1726867258.75017: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000382 ok: [managed_node2] 15247 1726867258.76366: no more pending results, returning what we have 15247 1726867258.76369: results queue empty 15247 1726867258.76370: checking for any_errors_fatal 15247 1726867258.76372: done checking for any_errors_fatal 15247 1726867258.76372: checking for max_fail_percentage 15247 1726867258.76374: done checking for max_fail_percentage 15247 1726867258.76375: checking to see if all hosts have failed and the running result is not ok 15247 1726867258.76376: done checking to see if all hosts have failed 15247 1726867258.76376: getting the remaining hosts for this loop 15247 1726867258.76476: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000382 15247 1726867258.76495: WORKER PROCESS EXITING 15247 1726867258.76483: done getting the remaining hosts for this loop 15247 1726867258.76525: getting the next task for host managed_node2 15247 1726867258.76531: done getting next task for host managed_node2 15247 1726867258.76533: ^ task is: TASK: meta (flush_handlers) 15247 1726867258.76534: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867258.76539: getting variables 15247 1726867258.76540: in VariableManager get_vars() 15247 1726867258.76562: Calling all_inventory to load vars for managed_node2 15247 1726867258.76565: Calling groups_inventory to load vars for managed_node2 15247 1726867258.76568: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867258.76580: Calling all_plugins_play to load vars for managed_node2 15247 1726867258.76584: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867258.76587: Calling groups_plugins_play to load vars for managed_node2 15247 1726867258.78823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867258.81740: done with get_vars() 15247 1726867258.81767: done getting variables 15247 1726867258.82192: in VariableManager get_vars() 15247 1726867258.82203: Calling all_inventory to load vars for managed_node2 15247 1726867258.82206: Calling groups_inventory to load vars for managed_node2 15247 1726867258.82208: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867258.82217: Calling all_plugins_play to load vars for managed_node2 15247 1726867258.82220: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867258.82225: Calling groups_plugins_play to load vars for managed_node2 15247 1726867258.84118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867258.86890: done with get_vars() 15247 1726867258.86917: done queuing things up, now waiting for results queue to drain 15247 1726867258.86923: results queue empty 15247 1726867258.86924: checking for any_errors_fatal 15247 1726867258.86928: done checking for any_errors_fatal 15247 1726867258.86928: checking for max_fail_percentage 15247 
1726867258.86929: done checking for max_fail_percentage 15247 1726867258.86930: checking to see if all hosts have failed and the running result is not ok 15247 1726867258.86938: done checking to see if all hosts have failed 15247 1726867258.86939: getting the remaining hosts for this loop 15247 1726867258.86940: done getting the remaining hosts for this loop 15247 1726867258.86942: getting the next task for host managed_node2 15247 1726867258.86946: done getting next task for host managed_node2 15247 1726867258.86953: ^ task is: TASK: Include the task 'delete_interface.yml' 15247 1726867258.86954: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867258.86957: getting variables 15247 1726867258.86958: in VariableManager get_vars() 15247 1726867258.86967: Calling all_inventory to load vars for managed_node2 15247 1726867258.86969: Calling groups_inventory to load vars for managed_node2 15247 1726867258.86971: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867258.86976: Calling all_plugins_play to load vars for managed_node2 15247 1726867258.86981: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867258.86984: Calling groups_plugins_play to load vars for managed_node2 15247 1726867258.89960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867258.91234: done with get_vars() 15247 1726867258.91248: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 17:20:58 -0400 (0:00:01.053) 0:00:28.622 
****** 15247 1726867258.91343: entering _queue_task() for managed_node2/include_tasks 15247 1726867258.91774: worker is 1 (out of 1 available) 15247 1726867258.91789: exiting _queue_task() for managed_node2/include_tasks 15247 1726867258.91802: done queuing things up, now waiting for results queue to drain 15247 1726867258.91803: waiting for pending results... 15247 1726867258.92112: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 15247 1726867258.92229: in run() - task 0affcac9-a3a5-8ce3-1923-000000000052 15247 1726867258.92250: variable 'ansible_search_path' from source: unknown 15247 1726867258.92272: calling self._execute() 15247 1726867258.92359: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867258.92363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867258.92367: variable 'omit' from source: magic vars 15247 1726867258.92726: variable 'ansible_distribution_major_version' from source: facts 15247 1726867258.92797: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867258.92802: _execute() done 15247 1726867258.92804: dumping result to json 15247 1726867258.92806: done dumping result, returning 15247 1726867258.92808: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [0affcac9-a3a5-8ce3-1923-000000000052] 15247 1726867258.92809: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000052 15247 1726867258.92886: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000052 15247 1726867258.92889: WORKER PROCESS EXITING 15247 1726867258.92920: no more pending results, returning what we have 15247 1726867258.92926: in VariableManager get_vars() 15247 1726867258.92958: Calling all_inventory to load vars for managed_node2 15247 1726867258.92961: Calling groups_inventory to load vars for managed_node2 15247 1726867258.92964: Calling all_plugins_inventory to load vars for 
managed_node2 15247 1726867258.92978: Calling all_plugins_play to load vars for managed_node2 15247 1726867258.92982: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867258.92984: Calling groups_plugins_play to load vars for managed_node2 15247 1726867258.96059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867258.99947: done with get_vars() 15247 1726867258.99975: variable 'ansible_search_path' from source: unknown 15247 1726867259.00079: we have included files to process 15247 1726867259.00081: generating all_blocks data 15247 1726867259.00083: done generating all_blocks data 15247 1726867259.00085: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15247 1726867259.00087: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15247 1726867259.00091: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 15247 1726867259.00539: done processing included file 15247 1726867259.00541: iterating over new_blocks loaded from include file 15247 1726867259.00543: in VariableManager get_vars() 15247 1726867259.00672: done with get_vars() 15247 1726867259.00674: filtering new block on tags 15247 1726867259.00708: done filtering new block on tags 15247 1726867259.00711: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 15247 1726867259.00716: extending task lists for all hosts with included blocks 15247 1726867259.00751: done extending task lists 15247 1726867259.00753: done processing included files 15247 1726867259.00753: results queue empty 15247 1726867259.00754: checking 
for any_errors_fatal 15247 1726867259.00756: done checking for any_errors_fatal 15247 1726867259.00757: checking for max_fail_percentage 15247 1726867259.00757: done checking for max_fail_percentage 15247 1726867259.00758: checking to see if all hosts have failed and the running result is not ok 15247 1726867259.00759: done checking to see if all hosts have failed 15247 1726867259.00760: getting the remaining hosts for this loop 15247 1726867259.00761: done getting the remaining hosts for this loop 15247 1726867259.00763: getting the next task for host managed_node2 15247 1726867259.00790: done getting next task for host managed_node2 15247 1726867259.00793: ^ task is: TASK: Remove test interface if necessary 15247 1726867259.00796: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867259.00798: getting variables 15247 1726867259.00799: in VariableManager get_vars() 15247 1726867259.00809: Calling all_inventory to load vars for managed_node2 15247 1726867259.00812: Calling groups_inventory to load vars for managed_node2 15247 1726867259.00814: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867259.00820: Calling all_plugins_play to load vars for managed_node2 15247 1726867259.00823: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867259.00826: Calling groups_plugins_play to load vars for managed_node2 15247 1726867259.02414: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867259.04900: done with get_vars() 15247 1726867259.04931: done getting variables 15247 1726867259.04971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 17:20:59 -0400 (0:00:00.136) 0:00:28.759 ****** 15247 1726867259.05007: entering _queue_task() for managed_node2/command 15247 1726867259.05462: worker is 1 (out of 1 available) 15247 1726867259.05518: exiting _queue_task() for managed_node2/command 15247 1726867259.05530: done queuing things up, now waiting for results queue to drain 15247 1726867259.05531: waiting for pending results... 
15247 1726867259.06010: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 15247 1726867259.06015: in run() - task 0affcac9-a3a5-8ce3-1923-000000000393 15247 1726867259.06018: variable 'ansible_search_path' from source: unknown 15247 1726867259.06020: variable 'ansible_search_path' from source: unknown 15247 1726867259.06023: calling self._execute() 15247 1726867259.06115: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867259.06138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867259.06183: variable 'omit' from source: magic vars 15247 1726867259.06575: variable 'ansible_distribution_major_version' from source: facts 15247 1726867259.06595: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867259.06607: variable 'omit' from source: magic vars 15247 1726867259.06668: variable 'omit' from source: magic vars 15247 1726867259.06781: variable 'interface' from source: set_fact 15247 1726867259.06863: variable 'omit' from source: magic vars 15247 1726867259.06867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867259.06900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867259.06925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867259.06948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867259.06970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867259.07010: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867259.07018: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867259.07026: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867259.07142: Set connection var ansible_shell_executable to /bin/sh 15247 1726867259.07188: Set connection var ansible_connection to ssh 15247 1726867259.07192: Set connection var ansible_shell_type to sh 15247 1726867259.07194: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867259.07196: Set connection var ansible_timeout to 10 15247 1726867259.07198: Set connection var ansible_pipelining to False 15247 1726867259.07224: variable 'ansible_shell_executable' from source: unknown 15247 1726867259.07233: variable 'ansible_connection' from source: unknown 15247 1726867259.07241: variable 'ansible_module_compression' from source: unknown 15247 1726867259.07297: variable 'ansible_shell_type' from source: unknown 15247 1726867259.07300: variable 'ansible_shell_executable' from source: unknown 15247 1726867259.07302: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867259.07304: variable 'ansible_pipelining' from source: unknown 15247 1726867259.07306: variable 'ansible_timeout' from source: unknown 15247 1726867259.07307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867259.07424: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867259.07448: variable 'omit' from source: magic vars 15247 1726867259.07458: starting attempt loop 15247 1726867259.07465: running the handler 15247 1726867259.07487: _low_level_execute_command(): starting 15247 1726867259.07547: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867259.08352: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 
1726867259.08415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.08501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867259.08522: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.08590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.08695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.08769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.10511: stdout chunk (state=3): >>>/root <<< 15247 1726867259.10681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.10853: stderr chunk (state=3): >>><<< 15247 1726867259.10856: stdout chunk (state=3): >>><<< 15247 1726867259.11106: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.11110: _low_level_execute_command(): starting 15247 1726867259.11112: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144 `" && echo ansible-tmp-1726867259.10969-16585-120053458130144="` echo /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144 `" ) && sleep 0' 15247 1726867259.12122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.12491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.12598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.12684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.14630: stdout chunk (state=3): >>>ansible-tmp-1726867259.10969-16585-120053458130144=/root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144 <<< 15247 1726867259.14733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.14882: stderr chunk (state=3): >>><<< 15247 1726867259.14887: stdout chunk (state=3): >>><<< 15247 1726867259.14907: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867259.10969-16585-120053458130144=/root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.14941: variable 'ansible_module_compression' from source: unknown 15247 1726867259.15110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15247 1726867259.15151: variable 'ansible_facts' from source: unknown 15247 1726867259.15339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py 15247 1726867259.15846: Sending initial data 15247 1726867259.15862: Sent initial data (154 bytes) 15247 1726867259.16949: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.17289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.17296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.17408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.18996: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15247 1726867259.19005: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 15247 1726867259.19015: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 15247 1726867259.19037: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 15247 1726867259.19086: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 15247 1726867259.19093: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 15247 1726867259.19097: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867259.19110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867259.19151: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpofl0an31 /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py <<< 15247 1726867259.19254: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py" <<< 15247 1726867259.19320: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpofl0an31" to remote "/root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py" <<< 15247 1726867259.20618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.20621: stdout chunk (state=3): >>><<< 15247 1726867259.20672: stderr chunk (state=3): >>><<< 15247 1726867259.20781: done transferring module to remote 15247 1726867259.20792: _low_level_execute_command(): starting 15247 1726867259.20797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/ /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py && sleep 0' 15247 1726867259.22028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.22083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.22185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.22325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.22349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.24166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.24284: stderr chunk (state=3): >>><<< 15247 1726867259.24291: stdout chunk (state=3): >>><<< 15247 1726867259.24421: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.24425: _low_level_execute_command(): starting 15247 1726867259.24428: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/AnsiballZ_command.py && sleep 0' 15247 1726867259.25541: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867259.25544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867259.25547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.25550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867259.25552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867259.25554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.25557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.25652: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.25731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.41713: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 17:20:59.408647", "end": "2024-09-20 17:20:59.416084", "delta": "0:00:00.007437", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15247 1726867259.43324: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. <<< 15247 1726867259.43327: stdout chunk (state=3): >>><<< 15247 1726867259.43330: stderr chunk (state=3): >>><<< 15247 1726867259.43360: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-20 17:20:59.408647", "end": "2024-09-20 17:20:59.416084", "delta": "0:00:00.007437", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. 15247 1726867259.43460: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867259.43463: _low_level_execute_command(): starting 15247 1726867259.43465: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867259.10969-16585-120053458130144/ > /dev/null 2>&1 && sleep 0' 15247 
1726867259.44069: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.44121: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.44197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867259.44244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.44247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.44312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.46482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.46486: stdout chunk (state=3): >>><<< 15247 1726867259.46488: stderr chunk (state=3): >>><<< 15247 1726867259.46490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.46492: handler run complete 15247 1726867259.46495: Evaluated conditional (False): False 15247 1726867259.46496: attempt loop complete, returning result 15247 1726867259.46498: _execute() done 15247 1726867259.46501: dumping result to json 15247 1726867259.46503: done dumping result, returning 15247 1726867259.46506: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [0affcac9-a3a5-8ce3-1923-000000000393] 15247 1726867259.46508: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000393 15247 1726867259.46662: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000393 fatal: [managed_node2]: FAILED! 
=> { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007437", "end": "2024-09-20 17:20:59.416084", "rc": 1, "start": "2024-09-20 17:20:59.408647" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 15247 1726867259.46741: no more pending results, returning what we have 15247 1726867259.46745: results queue empty 15247 1726867259.46746: checking for any_errors_fatal 15247 1726867259.46748: done checking for any_errors_fatal 15247 1726867259.46748: checking for max_fail_percentage 15247 1726867259.46750: done checking for max_fail_percentage 15247 1726867259.46751: checking to see if all hosts have failed and the running result is not ok 15247 1726867259.46752: done checking to see if all hosts have failed 15247 1726867259.46752: getting the remaining hosts for this loop 15247 1726867259.46753: done getting the remaining hosts for this loop 15247 1726867259.46757: getting the next task for host managed_node2 15247 1726867259.46765: done getting next task for host managed_node2 15247 1726867259.46767: ^ task is: TASK: meta (flush_handlers) 15247 1726867259.46769: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867259.46773: getting variables 15247 1726867259.46775: in VariableManager get_vars() 15247 1726867259.46805: Calling all_inventory to load vars for managed_node2 15247 1726867259.46807: Calling groups_inventory to load vars for managed_node2 15247 1726867259.46810: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867259.46939: Calling all_plugins_play to load vars for managed_node2 15247 1726867259.46943: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867259.46948: Calling groups_plugins_play to load vars for managed_node2 15247 1726867259.47730: WORKER PROCESS EXITING 15247 1726867259.49157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867259.51243: done with get_vars() 15247 1726867259.51267: done getting variables 15247 1726867259.51343: in VariableManager get_vars() 15247 1726867259.51352: Calling all_inventory to load vars for managed_node2 15247 1726867259.51355: Calling groups_inventory to load vars for managed_node2 15247 1726867259.51357: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867259.51361: Calling all_plugins_play to load vars for managed_node2 15247 1726867259.51364: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867259.51367: Calling groups_plugins_play to load vars for managed_node2 15247 1726867259.52929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867259.54800: done with get_vars() 15247 1726867259.54831: done queuing things up, now waiting for results queue to drain 15247 1726867259.54833: results queue empty 15247 1726867259.54834: checking for any_errors_fatal 15247 1726867259.54837: done checking for any_errors_fatal 15247 1726867259.54838: checking for max_fail_percentage 15247 1726867259.54839: done checking for max_fail_percentage 15247 1726867259.54839: checking to see if all 
hosts have failed and the running result is not ok 15247 1726867259.54840: done checking to see if all hosts have failed 15247 1726867259.54841: getting the remaining hosts for this loop 15247 1726867259.54842: done getting the remaining hosts for this loop 15247 1726867259.54844: getting the next task for host managed_node2 15247 1726867259.54848: done getting next task for host managed_node2 15247 1726867259.54850: ^ task is: TASK: meta (flush_handlers) 15247 1726867259.54851: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867259.54854: getting variables 15247 1726867259.54855: in VariableManager get_vars() 15247 1726867259.54864: Calling all_inventory to load vars for managed_node2 15247 1726867259.54866: Calling groups_inventory to load vars for managed_node2 15247 1726867259.54868: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867259.54872: Calling all_plugins_play to load vars for managed_node2 15247 1726867259.54876: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867259.54880: Calling groups_plugins_play to load vars for managed_node2 15247 1726867259.56538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867259.58814: done with get_vars() 15247 1726867259.58835: done getting variables 15247 1726867259.58887: in VariableManager get_vars() 15247 1726867259.58896: Calling all_inventory to load vars for managed_node2 15247 1726867259.58898: Calling groups_inventory to load vars for managed_node2 15247 1726867259.58900: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867259.58905: Calling all_plugins_play to load vars for managed_node2 15247 1726867259.58907: Calling 
groups_plugins_inventory to load vars for managed_node2 15247 1726867259.58909: Calling groups_plugins_play to load vars for managed_node2 15247 1726867259.61164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867259.63598: done with get_vars() 15247 1726867259.63629: done queuing things up, now waiting for results queue to drain 15247 1726867259.63631: results queue empty 15247 1726867259.63632: checking for any_errors_fatal 15247 1726867259.63634: done checking for any_errors_fatal 15247 1726867259.63634: checking for max_fail_percentage 15247 1726867259.63635: done checking for max_fail_percentage 15247 1726867259.63636: checking to see if all hosts have failed and the running result is not ok 15247 1726867259.63637: done checking to see if all hosts have failed 15247 1726867259.63637: getting the remaining hosts for this loop 15247 1726867259.63638: done getting the remaining hosts for this loop 15247 1726867259.63641: getting the next task for host managed_node2 15247 1726867259.63645: done getting next task for host managed_node2 15247 1726867259.63645: ^ task is: None 15247 1726867259.63647: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867259.63648: done queuing things up, now waiting for results queue to drain 15247 1726867259.63649: results queue empty 15247 1726867259.63650: checking for any_errors_fatal 15247 1726867259.63650: done checking for any_errors_fatal 15247 1726867259.63651: checking for max_fail_percentage 15247 1726867259.63652: done checking for max_fail_percentage 15247 1726867259.63653: checking to see if all hosts have failed and the running result is not ok 15247 1726867259.63653: done checking to see if all hosts have failed 15247 1726867259.63654: getting the next task for host managed_node2 15247 1726867259.63658: done getting next task for host managed_node2 15247 1726867259.63659: ^ task is: None 15247 1726867259.63660: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867259.63713: in VariableManager get_vars() 15247 1726867259.63735: done with get_vars() 15247 1726867259.63741: in VariableManager get_vars() 15247 1726867259.63753: done with get_vars() 15247 1726867259.63757: variable 'omit' from source: magic vars 15247 1726867259.63885: variable 'profile' from source: play vars 15247 1726867259.64085: in VariableManager get_vars() 15247 1726867259.64102: done with get_vars() 15247 1726867259.64124: variable 'omit' from source: magic vars 15247 1726867259.64384: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 15247 1726867259.66824: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867259.66991: getting the remaining hosts for this loop 15247 1726867259.66993: done getting the remaining hosts for this loop 15247 1726867259.66996: getting the next task for host managed_node2 15247 1726867259.66998: done getting next task for host managed_node2 15247 1726867259.67000: ^ task is: TASK: Gathering Facts 15247 1726867259.67002: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867259.67004: getting variables 15247 1726867259.67005: in VariableManager get_vars() 15247 1726867259.67019: Calling all_inventory to load vars for managed_node2 15247 1726867259.67022: Calling groups_inventory to load vars for managed_node2 15247 1726867259.67024: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867259.67029: Calling all_plugins_play to load vars for managed_node2 15247 1726867259.67032: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867259.67034: Calling groups_plugins_play to load vars for managed_node2 15247 1726867259.69852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867259.74255: done with get_vars() 15247 1726867259.74280: done getting variables 15247 1726867259.74472: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 17:20:59 -0400 (0:00:00.695) 0:00:29.454 ****** 15247 1726867259.74531: entering _queue_task() for managed_node2/gather_facts 15247 1726867259.75027: worker is 1 (out of 1 available) 15247 1726867259.75040: exiting _queue_task() for managed_node2/gather_facts 15247 1726867259.75052: done queuing things up, now waiting for results queue to drain 15247 1726867259.75053: waiting for pending results... 
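The "Remove test interface if necessary" task earlier in this log fails with rc=1 ("Cannot find device \"LSR-TST-br31\"") yet the run continues, marked "...ignoring". A minimal illustration of that behavior, assuming the task sets `ignore_errors: true` (inferred from the "...ignoring" marker; this is not Ansible source, just the shape of the command-module result shown above):

```python
import json

# The command-module result for the failed "ip link del" task, as seen
# in the log above (abbreviated to the relevant fields).
raw = '''{"changed": true, "stdout": "",
          "stderr": "Cannot find device \\"LSR-TST-br31\\"",
          "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"],
          "failed": true, "msg": "non-zero return code"}'''
result = json.loads(raw)

# Assumption: the task has ignore_errors enabled, which is why the
# strategy prints "...ignoring" and proceeds instead of failing the host.
ignore_errors = True
fatal = result["failed"] and not ignore_errors

print(fatal)  # False: the failure is reported but does not stop the play
```

The device-not-found error is expected here: the task is cleanup, deleting a test interface that may or may not exist, so a non-zero rc on a missing device is harmless.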
15247 1726867259.75443: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867259.75631: in run() - task 0affcac9-a3a5-8ce3-1923-0000000003a1 15247 1726867259.75654: variable 'ansible_search_path' from source: unknown 15247 1726867259.75701: calling self._execute() 15247 1726867259.75809: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867259.75830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867259.75854: variable 'omit' from source: magic vars 15247 1726867259.76358: variable 'ansible_distribution_major_version' from source: facts 15247 1726867259.76476: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867259.76481: variable 'omit' from source: magic vars 15247 1726867259.76484: variable 'omit' from source: magic vars 15247 1726867259.76493: variable 'omit' from source: magic vars 15247 1726867259.76537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867259.76658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867259.76670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867259.76695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867259.76721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867259.76759: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867259.76775: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867259.76825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867259.76908: Set connection var ansible_shell_executable to /bin/sh 15247 1726867259.76921: Set 
connection var ansible_connection to ssh 15247 1726867259.76937: Set connection var ansible_shell_type to sh 15247 1726867259.76948: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867259.76960: Set connection var ansible_timeout to 10 15247 1726867259.76971: Set connection var ansible_pipelining to False 15247 1726867259.77041: variable 'ansible_shell_executable' from source: unknown 15247 1726867259.77045: variable 'ansible_connection' from source: unknown 15247 1726867259.77047: variable 'ansible_module_compression' from source: unknown 15247 1726867259.77050: variable 'ansible_shell_type' from source: unknown 15247 1726867259.77052: variable 'ansible_shell_executable' from source: unknown 15247 1726867259.77054: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867259.77056: variable 'ansible_pipelining' from source: unknown 15247 1726867259.77059: variable 'ansible_timeout' from source: unknown 15247 1726867259.77061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867259.77683: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867259.77687: variable 'omit' from source: magic vars 15247 1726867259.77690: starting attempt loop 15247 1726867259.77692: running the handler 15247 1726867259.77694: variable 'ansible_facts' from source: unknown 15247 1726867259.77696: _low_level_execute_command(): starting 15247 1726867259.77698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867259.78782: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.78795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 
1726867259.78811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867259.78840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867259.78951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867259.78963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.78982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.79064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.80761: stdout chunk (state=3): >>>/root <<< 15247 1726867259.80886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.81207: stdout chunk (state=3): >>><<< 15247 1726867259.81210: stderr chunk (state=3): >>><<< 15247 1726867259.81217: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.81220: _low_level_execute_command(): starting 15247 1726867259.81222: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501 `" && echo ansible-tmp-1726867259.8111937-16615-257526269607501="` echo /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501 `" ) && sleep 0' 15247 1726867259.82351: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.82494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.82683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.82706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.82782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.84737: stdout chunk (state=3): >>>ansible-tmp-1726867259.8111937-16615-257526269607501=/root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501 <<< 15247 1726867259.84859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.84870: stdout chunk (state=3): >>><<< 15247 1726867259.84891: stderr chunk (state=3): >>><<< 15247 1726867259.84974: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867259.8111937-16615-257526269607501=/root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.85085: variable 'ansible_module_compression' from source: unknown 15247 1726867259.85185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867259.85392: variable 'ansible_facts' from source: unknown 15247 1726867259.86220: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py 15247 1726867259.86896: Sending initial data 15247 1726867259.86900: Sent initial data (154 bytes) 15247 1726867259.88058: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.88266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.88388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.88458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.90119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867259.90221: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867259.90266: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpmpvsz_qx /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py <<< 15247 1726867259.90273: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py" <<< 15247 1726867259.90304: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpmpvsz_qx" to remote "/root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py" <<< 15247 1726867259.94285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.94298: stdout chunk (state=3): >>><<< 15247 1726867259.94321: stderr chunk (state=3): >>><<< 15247 1726867259.94485: done transferring module to remote 15247 1726867259.94488: _low_level_execute_command(): starting 15247 1726867259.94491: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/ /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py && sleep 0' 15247 1726867259.95753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867259.95866: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.96472: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.96534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867259.96865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867259.98653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867259.98656: stdout chunk (state=3): >>><<< 15247 1726867259.98658: stderr chunk (state=3): >>><<< 15247 1726867259.98661: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867259.98667: _low_level_execute_command(): starting 15247 1726867259.98669: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/AnsiballZ_setup.py && sleep 0' 15247 1726867259.99796: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867259.99824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867259.99925: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867259.99940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867259.99960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867260.00146: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 15247 1726867260.62353: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "00", "epoch": "1726867260", "epoch_int": "1726867260", "date": "2024-09-20", "time": "17:21:00", "iso8601_micro": "2024-09-20T21:21:00.270638Z", "iso8601": "2024-09-20T21:21:00Z", "iso8601_basic": "20240920T172100270638", "iso8601_basic_short": "20240920T172100", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_loadavg": {"1m": 0.54833984375, "5m": 0.38525390625, "15m": 0.19189453125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", 
"ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 498, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796945920, "block_size": 4096, "block_total": 65519099, "block_available": 63915270, "block_used": 1603829, "inode_total": 131070960, 
"inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867260.64406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867260.64419: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 15247 1726867260.64469: stderr chunk (state=3): >>><<< 15247 1726867260.64684: stdout chunk (state=3): >>><<< 15247 1726867260.64688: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_iscsi_iqn": "", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "00", "epoch": "1726867260", "epoch_int": "1726867260", "date": "2024-09-20", "time": "17:21:00", "iso8601_micro": "2024-09-20T21:21:00.270638Z", "iso8601": "2024-09-20T21:21:00Z", "iso8601_basic": "20240920T172100270638", "iso8601_basic_short": "20240920T172100", 
"tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["lo", "eth0"], 
"ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_loadavg": {"1m": 0.54833984375, "5m": 0.38525390625, "15m": 0.19189453125}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) 
Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2943, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 588, "free": 2943}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 
GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 498, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796945920, "block_size": 4096, "block_total": 65519099, "block_available": 63915270, "block_used": 1603829, "inode_total": 131070960, "inode_available": 131029050, "inode_used": 41910, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867260.65276: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867260.65573: _low_level_execute_command(): starting 15247 1726867260.65576: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867259.8111937-16615-257526269607501/ > /dev/null 2>&1 && sleep 0' 15247 1726867260.66301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867260.66331: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867260.66345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867260.66436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867260.66475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867260.66505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867260.66538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867260.66670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867260.69008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867260.69013: stdout chunk (state=3): >>><<< 15247 1726867260.69016: stderr chunk (state=3): >>><<< 15247 1726867260.69018: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867260.69020: handler run complete 15247 1726867260.69080: variable 'ansible_facts' from source: unknown 15247 1726867260.69347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.70080: variable 'ansible_facts' from source: unknown 15247 1726867260.70251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.70475: attempt loop complete, returning result 15247 1726867260.70487: _execute() done 15247 1726867260.70493: dumping result to json 15247 1726867260.70528: done dumping result, returning 15247 1726867260.70540: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-0000000003a1] 15247 1726867260.70550: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003a1 ok: [managed_node2] 15247 1726867260.71455: no more pending results, returning what we have 15247 1726867260.71458: results queue empty 15247 1726867260.71459: checking for any_errors_fatal 15247 1726867260.71460: done checking for any_errors_fatal 15247 1726867260.71461: checking for max_fail_percentage 15247 1726867260.71462: done checking for max_fail_percentage 15247 1726867260.71463: checking to see if all hosts have failed and the running result is not ok 15247 1726867260.71464: done checking to see if all hosts have failed 15247 1726867260.71464: getting the remaining hosts for this loop 15247 1726867260.71466: done getting the remaining hosts for this loop 15247 1726867260.71469: getting the next task for host managed_node2 15247 1726867260.71474: done getting next task for host managed_node2 15247 1726867260.71475: ^ task is: TASK: meta (flush_handlers) 15247 1726867260.71479: ^ state is: 
HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867260.71483: getting variables 15247 1726867260.71484: in VariableManager get_vars() 15247 1726867260.71600: Calling all_inventory to load vars for managed_node2 15247 1726867260.71721: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003a1 15247 1726867260.71724: WORKER PROCESS EXITING 15247 1726867260.71727: Calling groups_inventory to load vars for managed_node2 15247 1726867260.71730: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867260.71739: Calling all_plugins_play to load vars for managed_node2 15247 1726867260.71742: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867260.71745: Calling groups_plugins_play to load vars for managed_node2 15247 1726867260.73348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.75153: done with get_vars() 15247 1726867260.75176: done getting variables 15247 1726867260.75359: in VariableManager get_vars() 15247 1726867260.75372: Calling all_inventory to load vars for managed_node2 15247 1726867260.75375: Calling groups_inventory to load vars for managed_node2 15247 1726867260.75379: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867260.75384: Calling all_plugins_play to load vars for managed_node2 15247 1726867260.75387: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867260.75390: Calling groups_plugins_play to load vars for managed_node2 15247 1726867260.78151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.80445: done with get_vars() 15247 1726867260.80474: done 
queuing things up, now waiting for results queue to drain 15247 1726867260.80479: results queue empty 15247 1726867260.80480: checking for any_errors_fatal 15247 1726867260.80484: done checking for any_errors_fatal 15247 1726867260.80485: checking for max_fail_percentage 15247 1726867260.80490: done checking for max_fail_percentage 15247 1726867260.80491: checking to see if all hosts have failed and the running result is not ok 15247 1726867260.80491: done checking to see if all hosts have failed 15247 1726867260.80492: getting the remaining hosts for this loop 15247 1726867260.80493: done getting the remaining hosts for this loop 15247 1726867260.80496: getting the next task for host managed_node2 15247 1726867260.80500: done getting next task for host managed_node2 15247 1726867260.80507: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15247 1726867260.80509: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867260.80519: getting variables 15247 1726867260.80520: in VariableManager get_vars() 15247 1726867260.80535: Calling all_inventory to load vars for managed_node2 15247 1726867260.80537: Calling groups_inventory to load vars for managed_node2 15247 1726867260.80539: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867260.80545: Calling all_plugins_play to load vars for managed_node2 15247 1726867260.80547: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867260.80550: Calling groups_plugins_play to load vars for managed_node2 15247 1726867260.81800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.83973: done with get_vars() 15247 1726867260.83998: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:21:00 -0400 (0:00:01.095) 0:00:30.550 ****** 15247 1726867260.84076: entering _queue_task() for managed_node2/include_tasks 15247 1726867260.84480: worker is 1 (out of 1 available) 15247 1726867260.84493: exiting _queue_task() for managed_node2/include_tasks 15247 1726867260.84621: done queuing things up, now waiting for results queue to drain 15247 1726867260.84623: waiting for pending results... 
15247 1726867260.84824: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 15247 1726867260.84961: in run() - task 0affcac9-a3a5-8ce3-1923-00000000005a 15247 1726867260.84988: variable 'ansible_search_path' from source: unknown 15247 1726867260.84996: variable 'ansible_search_path' from source: unknown 15247 1726867260.85056: calling self._execute() 15247 1726867260.85234: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867260.85254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867260.85276: variable 'omit' from source: magic vars 15247 1726867260.85715: variable 'ansible_distribution_major_version' from source: facts 15247 1726867260.85738: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867260.85751: _execute() done 15247 1726867260.85782: dumping result to json 15247 1726867260.85786: done dumping result, returning 15247 1726867260.85788: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-8ce3-1923-00000000005a] 15247 1726867260.85791: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005a 15247 1726867260.86072: no more pending results, returning what we have 15247 1726867260.86079: in VariableManager get_vars() 15247 1726867260.86125: Calling all_inventory to load vars for managed_node2 15247 1726867260.86128: Calling groups_inventory to load vars for managed_node2 15247 1726867260.86131: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867260.86145: Calling all_plugins_play to load vars for managed_node2 15247 1726867260.86148: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867260.86152: Calling groups_plugins_play to load vars for managed_node2 15247 1726867260.86794: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005a 15247 
1726867260.86797: WORKER PROCESS EXITING 15247 1726867260.87782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.89441: done with get_vars() 15247 1726867260.89475: variable 'ansible_search_path' from source: unknown 15247 1726867260.89480: variable 'ansible_search_path' from source: unknown 15247 1726867260.89512: we have included files to process 15247 1726867260.89513: generating all_blocks data 15247 1726867260.89514: done generating all_blocks data 15247 1726867260.89515: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867260.89516: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867260.89520: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 15247 1726867260.90131: done processing included file 15247 1726867260.90133: iterating over new_blocks loaded from include file 15247 1726867260.90135: in VariableManager get_vars() 15247 1726867260.90158: done with get_vars() 15247 1726867260.90160: filtering new block on tags 15247 1726867260.90176: done filtering new block on tags 15247 1726867260.90184: in VariableManager get_vars() 15247 1726867260.90204: done with get_vars() 15247 1726867260.90206: filtering new block on tags 15247 1726867260.90225: done filtering new block on tags 15247 1726867260.90228: in VariableManager get_vars() 15247 1726867260.90246: done with get_vars() 15247 1726867260.90247: filtering new block on tags 15247 1726867260.90261: done filtering new block on tags 15247 1726867260.90263: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 15247 1726867260.90268: extending task lists for all hosts 
with included blocks 15247 1726867260.90734: done extending task lists 15247 1726867260.90736: done processing included files 15247 1726867260.90737: results queue empty 15247 1726867260.90737: checking for any_errors_fatal 15247 1726867260.90739: done checking for any_errors_fatal 15247 1726867260.90740: checking for max_fail_percentage 15247 1726867260.90741: done checking for max_fail_percentage 15247 1726867260.90741: checking to see if all hosts have failed and the running result is not ok 15247 1726867260.90742: done checking to see if all hosts have failed 15247 1726867260.90743: getting the remaining hosts for this loop 15247 1726867260.90744: done getting the remaining hosts for this loop 15247 1726867260.90747: getting the next task for host managed_node2 15247 1726867260.90750: done getting next task for host managed_node2 15247 1726867260.90753: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15247 1726867260.90755: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867260.90764: getting variables 15247 1726867260.90765: in VariableManager get_vars() 15247 1726867260.90803: Calling all_inventory to load vars for managed_node2 15247 1726867260.90805: Calling groups_inventory to load vars for managed_node2 15247 1726867260.90807: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867260.90813: Calling all_plugins_play to load vars for managed_node2 15247 1726867260.90815: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867260.90844: Calling groups_plugins_play to load vars for managed_node2 15247 1726867260.92956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867260.94608: done with get_vars() 15247 1726867260.94751: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:21:00 -0400 (0:00:00.107) 0:00:30.657 ****** 15247 1726867260.94955: entering _queue_task() for managed_node2/setup 15247 1726867260.95667: worker is 1 (out of 1 available) 15247 1726867260.95680: exiting _queue_task() for managed_node2/setup 15247 1726867260.95693: done queuing things up, now waiting for results queue to drain 15247 1726867260.95695: waiting for pending results... 
15247 1726867260.96113: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 15247 1726867260.96297: in run() - task 0affcac9-a3a5-8ce3-1923-0000000003e2 15247 1726867260.96323: variable 'ansible_search_path' from source: unknown 15247 1726867260.96331: variable 'ansible_search_path' from source: unknown 15247 1726867260.96373: calling self._execute() 15247 1726867260.96480: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867260.96493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867260.96513: variable 'omit' from source: magic vars 15247 1726867260.96907: variable 'ansible_distribution_major_version' from source: facts 15247 1726867260.96958: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867260.97417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867260.99873: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867260.99985: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867261.00055: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867261.00212: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867261.00217: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867261.00235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867261.00274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867261.00351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867261.00400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867261.00426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867261.00560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867261.00576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867261.00615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867261.00665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867261.00691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867261.00884: variable '__network_required_facts' from source: role 
'' defaults 15247 1726867261.00919: variable 'ansible_facts' from source: unknown 15247 1726867261.02480: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 15247 1726867261.02485: when evaluation is False, skipping this task 15247 1726867261.02491: _execute() done 15247 1726867261.02495: dumping result to json 15247 1726867261.02497: done dumping result, returning 15247 1726867261.02500: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-8ce3-1923-0000000003e2] 15247 1726867261.02502: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e2 15247 1726867261.02880: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e2 15247 1726867261.02887: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867261.02934: no more pending results, returning what we have 15247 1726867261.02938: results queue empty 15247 1726867261.02939: checking for any_errors_fatal 15247 1726867261.02941: done checking for any_errors_fatal 15247 1726867261.02942: checking for max_fail_percentage 15247 1726867261.02944: done checking for max_fail_percentage 15247 1726867261.02945: checking to see if all hosts have failed and the running result is not ok 15247 1726867261.02946: done checking to see if all hosts have failed 15247 1726867261.02946: getting the remaining hosts for this loop 15247 1726867261.02948: done getting the remaining hosts for this loop 15247 1726867261.02952: getting the next task for host managed_node2 15247 1726867261.02963: done getting next task for host managed_node2 15247 1726867261.02967: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 15247 1726867261.02970: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867261.02986: getting variables 15247 1726867261.02988: in VariableManager get_vars() 15247 1726867261.03032: Calling all_inventory to load vars for managed_node2 15247 1726867261.03035: Calling groups_inventory to load vars for managed_node2 15247 1726867261.03041: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867261.03054: Calling all_plugins_play to load vars for managed_node2 15247 1726867261.03058: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867261.03061: Calling groups_plugins_play to load vars for managed_node2 15247 1726867261.06294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867261.09711: done with get_vars() 15247 1726867261.09744: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:21:01 -0400 (0:00:00.150) 0:00:30.808 ****** 15247 1726867261.09911: entering _queue_task() for managed_node2/stat 15247 1726867261.10420: worker is 1 (out of 1 available) 15247 1726867261.10432: exiting _queue_task() for managed_node2/stat 15247 1726867261.10443: done queuing things up, now waiting for results queue to drain 15247 1726867261.10445: waiting for pending results... 
15247 1726867261.10658: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 15247 1726867261.10865: in run() - task 0affcac9-a3a5-8ce3-1923-0000000003e4 15247 1726867261.10869: variable 'ansible_search_path' from source: unknown 15247 1726867261.10872: variable 'ansible_search_path' from source: unknown 15247 1726867261.10875: calling self._execute() 15247 1726867261.10971: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867261.10994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867261.11011: variable 'omit' from source: magic vars 15247 1726867261.11421: variable 'ansible_distribution_major_version' from source: facts 15247 1726867261.11441: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867261.11648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867261.11975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867261.12029: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867261.12073: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867261.12172: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867261.12212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867261.12243: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867261.12280: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867261.12315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867261.12420: variable '__network_is_ostree' from source: set_fact 15247 1726867261.12433: Evaluated conditional (not __network_is_ostree is defined): False 15247 1726867261.12440: when evaluation is False, skipping this task 15247 1726867261.12450: _execute() done 15247 1726867261.12462: dumping result to json 15247 1726867261.12494: done dumping result, returning 15247 1726867261.12500: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-8ce3-1923-0000000003e4] 15247 1726867261.12505: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e4 15247 1726867261.12673: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e4 15247 1726867261.12680: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15247 1726867261.12749: no more pending results, returning what we have 15247 1726867261.12753: results queue empty 15247 1726867261.12756: checking for any_errors_fatal 15247 1726867261.12767: done checking for any_errors_fatal 15247 1726867261.12768: checking for max_fail_percentage 15247 1726867261.12770: done checking for max_fail_percentage 15247 1726867261.12771: checking to see if all hosts have failed and the running result is not ok 15247 1726867261.12772: done checking to see if all hosts have failed 15247 1726867261.12772: getting the remaining hosts for this loop 15247 1726867261.12774: done getting the remaining hosts for this loop 15247 
1726867261.12780: getting the next task for host managed_node2 15247 1726867261.12787: done getting next task for host managed_node2 15247 1726867261.12791: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15247 1726867261.12794: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867261.12814: getting variables 15247 1726867261.12816: in VariableManager get_vars() 15247 1726867261.13052: Calling all_inventory to load vars for managed_node2 15247 1726867261.13056: Calling groups_inventory to load vars for managed_node2 15247 1726867261.13059: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867261.13069: Calling all_plugins_play to load vars for managed_node2 15247 1726867261.13072: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867261.13075: Calling groups_plugins_play to load vars for managed_node2 15247 1726867261.14757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867261.16344: done with get_vars() 15247 1726867261.16369: done getting variables 15247 1726867261.16439: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:21:01 -0400 (0:00:00.065) 0:00:30.874 ****** 15247 1726867261.16479: entering _queue_task() for managed_node2/set_fact 15247 1726867261.17079: worker is 1 (out of 1 available) 15247 1726867261.17088: exiting _queue_task() for managed_node2/set_fact 15247 1726867261.17098: done queuing things up, now waiting for results queue to drain 15247 1726867261.17100: waiting for pending results... 15247 1726867261.17296: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 15247 1726867261.17309: in run() - task 0affcac9-a3a5-8ce3-1923-0000000003e5 15247 1726867261.17342: variable 'ansible_search_path' from source: unknown 15247 1726867261.17349: variable 'ansible_search_path' from source: unknown 15247 1726867261.17397: calling self._execute() 15247 1726867261.17516: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867261.17531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867261.17555: variable 'omit' from source: magic vars 15247 1726867261.17983: variable 'ansible_distribution_major_version' from source: facts 15247 1726867261.17992: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867261.18174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867261.18522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867261.18547: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867261.18588: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 
1726867261.18635: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867261.18725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867261.18848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867261.18852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867261.18854: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867261.18924: variable '__network_is_ostree' from source: set_fact 15247 1726867261.18937: Evaluated conditional (not __network_is_ostree is defined): False 15247 1726867261.18946: when evaluation is False, skipping this task 15247 1726867261.18960: _execute() done 15247 1726867261.18968: dumping result to json 15247 1726867261.18975: done dumping result, returning 15247 1726867261.18990: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-8ce3-1923-0000000003e5] 15247 1726867261.19001: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e5 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 15247 1726867261.19222: no more pending results, returning what we have 15247 1726867261.19226: results queue empty 15247 1726867261.19228: checking for any_errors_fatal 15247 1726867261.19236: done checking 
for any_errors_fatal 15247 1726867261.19237: checking for max_fail_percentage 15247 1726867261.19238: done checking for max_fail_percentage 15247 1726867261.19239: checking to see if all hosts have failed and the running result is not ok 15247 1726867261.19240: done checking to see if all hosts have failed 15247 1726867261.19241: getting the remaining hosts for this loop 15247 1726867261.19242: done getting the remaining hosts for this loop 15247 1726867261.19247: getting the next task for host managed_node2 15247 1726867261.19258: done getting next task for host managed_node2 15247 1726867261.19263: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 15247 1726867261.19266: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867261.19390: getting variables 15247 1726867261.19392: in VariableManager get_vars() 15247 1726867261.19439: Calling all_inventory to load vars for managed_node2 15247 1726867261.19441: Calling groups_inventory to load vars for managed_node2 15247 1726867261.19444: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867261.19456: Calling all_plugins_play to load vars for managed_node2 15247 1726867261.19459: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867261.19463: Calling groups_plugins_play to load vars for managed_node2 15247 1726867261.19992: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e5 15247 1726867261.19995: WORKER PROCESS EXITING 15247 1726867261.20987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867261.22579: done with get_vars() 15247 1726867261.22603: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:21:01 -0400 (0:00:00.062) 0:00:30.936 ****** 15247 1726867261.22709: entering _queue_task() for managed_node2/service_facts 15247 1726867261.23029: worker is 1 (out of 1 available) 15247 1726867261.23042: exiting _queue_task() for managed_node2/service_facts 15247 1726867261.23056: done queuing things up, now waiting for results queue to drain 15247 1726867261.23172: waiting for pending results... 
15247 1726867261.23361: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 15247 1726867261.23506: in run() - task 0affcac9-a3a5-8ce3-1923-0000000003e7 15247 1726867261.23533: variable 'ansible_search_path' from source: unknown 15247 1726867261.23540: variable 'ansible_search_path' from source: unknown 15247 1726867261.23583: calling self._execute() 15247 1726867261.23692: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867261.23705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867261.23733: variable 'omit' from source: magic vars 15247 1726867261.24131: variable 'ansible_distribution_major_version' from source: facts 15247 1726867261.24148: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867261.24265: variable 'omit' from source: magic vars 15247 1726867261.24268: variable 'omit' from source: magic vars 15247 1726867261.24271: variable 'omit' from source: magic vars 15247 1726867261.24316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867261.24356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867261.24388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867261.24414: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867261.24431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867261.24467: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867261.24476: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867261.24495: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node2' 15247 1726867261.24611: Set connection var ansible_shell_executable to /bin/sh 15247 1726867261.24624: Set connection var ansible_connection to ssh 15247 1726867261.24631: Set connection var ansible_shell_type to sh 15247 1726867261.24681: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867261.24684: Set connection var ansible_timeout to 10 15247 1726867261.24686: Set connection var ansible_pipelining to False 15247 1726867261.24688: variable 'ansible_shell_executable' from source: unknown 15247 1726867261.24690: variable 'ansible_connection' from source: unknown 15247 1726867261.24694: variable 'ansible_module_compression' from source: unknown 15247 1726867261.24699: variable 'ansible_shell_type' from source: unknown 15247 1726867261.24709: variable 'ansible_shell_executable' from source: unknown 15247 1726867261.24721: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867261.24728: variable 'ansible_pipelining' from source: unknown 15247 1726867261.24733: variable 'ansible_timeout' from source: unknown 15247 1726867261.24739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867261.24933: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867261.25038: variable 'omit' from source: magic vars 15247 1726867261.25041: starting attempt loop 15247 1726867261.25043: running the handler 15247 1726867261.25045: _low_level_execute_command(): starting 15247 1726867261.25047: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867261.25799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867261.25826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867261.25843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867261.25860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867261.25954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867261.27658: stdout chunk (state=3): >>>/root <<< 15247 1726867261.27822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867261.27826: stdout chunk (state=3): >>><<< 15247 1726867261.27828: stderr chunk (state=3): >>><<< 15247 1726867261.27943: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867261.27947: _low_level_execute_command(): starting 15247 1726867261.27950: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987 `" && echo ansible-tmp-1726867261.278516-16692-43937187850987="` echo /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987 `" ) && sleep 0' 15247 1726867261.28497: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867261.28559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867261.28576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867261.28597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867261.28672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867261.30660: stdout chunk (state=3): >>>ansible-tmp-1726867261.278516-16692-43937187850987=/root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987 <<< 15247 1726867261.30794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867261.30797: stdout chunk (state=3): >>><<< 15247 1726867261.30800: stderr chunk (state=3): >>><<< 15247 1726867261.30817: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867261.278516-16692-43937187850987=/root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867261.30897: variable 'ansible_module_compression' from source: unknown 15247 1726867261.30983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 15247 1726867261.30986: variable 'ansible_facts' from source: unknown 15247 1726867261.31074: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py 15247 1726867261.31306: Sending initial data 15247 1726867261.31309: Sent initial data (160 bytes) 15247 1726867261.31988: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867261.32051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867261.32129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867261.33723: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867261.33882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867261.33920: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmphltggq6l /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py <<< 15247 1726867261.33924: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py" <<< 15247 1726867261.33983: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmphltggq6l" to remote "/root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py" <<< 15247 1726867261.34872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867261.34905: stderr chunk (state=3): >>><<< 15247 1726867261.34919: stdout chunk (state=3): >>><<< 15247 1726867261.35002: done transferring module to remote 15247 1726867261.35082: _low_level_execute_command(): starting 15247 1726867261.35086: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/ /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py && sleep 0' 15247 1726867261.35685: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867261.35699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867261.35715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867261.35764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867261.35841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867261.35861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867261.35892: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867261.35959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867261.37921: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867261.37924: stdout chunk (state=3): >>><<< 15247 1726867261.37931: stderr chunk (state=3): >>><<< 15247 1726867261.38032: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867261.38036: _low_level_execute_command(): starting 15247 1726867261.38038: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/AnsiballZ_service_facts.py && sleep 0' 15247 1726867261.38695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867261.38720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867261.38737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867261.38758: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 15247 1726867261.38842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867262.99664: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 15247 1726867262.99741: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 15247 1726867262.99782: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": 
"nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": 
{"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 15247 1726867263.01236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867263.01250: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. <<< 15247 1726867263.01318: stderr chunk (state=3): >>><<< 15247 1726867263.01321: stdout chunk (state=3): >>><<< 15247 1726867263.01484: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": 
"rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": 
"systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.12.116 closed. 15247 1726867263.14228: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867263.14291: _low_level_execute_command(): starting 15247 1726867263.14341: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867261.278516-16692-43937187850987/ > /dev/null 2>&1 && sleep 0' 15247 1726867263.15647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
15247 1726867263.15737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867263.15755: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867263.15790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867263.15890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867263.17836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867263.17839: stdout chunk (state=3): >>><<< 15247 1726867263.17842: stderr chunk (state=3): >>><<< 15247 1726867263.17856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867263.17979: handler run complete 15247 1726867263.18071: variable 'ansible_facts' from 
source: unknown 15247 1726867263.18235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867263.18867: variable 'ansible_facts' from source: unknown 15247 1726867263.19096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867263.19433: attempt loop complete, returning result 15247 1726867263.19445: _execute() done 15247 1726867263.19451: dumping result to json 15247 1726867263.19601: done dumping result, returning 15247 1726867263.19748: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-8ce3-1923-0000000003e7] 15247 1726867263.19751: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e7 15247 1726867263.27980: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e7 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867263.28030: no more pending results, returning what we have 15247 1726867263.28032: results queue empty 15247 1726867263.28033: checking for any_errors_fatal 15247 1726867263.28035: done checking for any_errors_fatal 15247 1726867263.28036: checking for max_fail_percentage 15247 1726867263.28037: done checking for max_fail_percentage 15247 1726867263.28038: checking to see if all hosts have failed and the running result is not ok 15247 1726867263.28039: done checking to see if all hosts have failed 15247 1726867263.28053: getting the remaining hosts for this loop 15247 1726867263.28070: done getting the remaining hosts for this loop 15247 1726867263.28074: getting the next task for host managed_node2 15247 1726867263.28079: done getting next task for host managed_node2 15247 1726867263.28096: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 15247 
1726867263.28104: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867263.28127: getting variables 15247 1726867263.28129: in VariableManager get_vars() 15247 1726867263.28169: Calling all_inventory to load vars for managed_node2 15247 1726867263.28171: Calling groups_inventory to load vars for managed_node2 15247 1726867263.28202: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867263.28212: Calling all_plugins_play to load vars for managed_node2 15247 1726867263.28215: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867263.28218: Calling groups_plugins_play to load vars for managed_node2 15247 1726867263.28758: WORKER PROCESS EXITING 15247 1726867263.30227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867263.32123: done with get_vars() 15247 1726867263.32145: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:21:03 -0400 (0:00:02.095) 0:00:33.031 ****** 15247 1726867263.32231: entering _queue_task() for managed_node2/package_facts 15247 1726867263.32635: worker is 1 (out of 1 available) 15247 1726867263.32648: exiting _queue_task() for managed_node2/package_facts 15247 1726867263.32661: done queuing things up, now waiting 
for results queue to drain 15247 1726867263.32669: waiting for pending results... 15247 1726867263.32975: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 15247 1726867263.33180: in run() - task 0affcac9-a3a5-8ce3-1923-0000000003e8 15247 1726867263.33185: variable 'ansible_search_path' from source: unknown 15247 1726867263.33188: variable 'ansible_search_path' from source: unknown 15247 1726867263.33206: calling self._execute() 15247 1726867263.33351: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867263.33363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867263.33388: variable 'omit' from source: magic vars 15247 1726867263.33881: variable 'ansible_distribution_major_version' from source: facts 15247 1726867263.33932: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867263.33942: variable 'omit' from source: magic vars 15247 1726867263.33995: variable 'omit' from source: magic vars 15247 1726867263.34042: variable 'omit' from source: magic vars 15247 1726867263.34122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867263.34132: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867263.34166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867263.34199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867263.34240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867263.34312: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867263.34328: variable 'ansible_host' from source: host vars for 
'managed_node2' 15247 1726867263.34348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867263.34518: Set connection var ansible_shell_executable to /bin/sh 15247 1726867263.34552: Set connection var ansible_connection to ssh 15247 1726867263.34556: Set connection var ansible_shell_type to sh 15247 1726867263.34558: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867263.34593: Set connection var ansible_timeout to 10 15247 1726867263.34596: Set connection var ansible_pipelining to False 15247 1726867263.34645: variable 'ansible_shell_executable' from source: unknown 15247 1726867263.34648: variable 'ansible_connection' from source: unknown 15247 1726867263.34651: variable 'ansible_module_compression' from source: unknown 15247 1726867263.34653: variable 'ansible_shell_type' from source: unknown 15247 1726867263.34688: variable 'ansible_shell_executable' from source: unknown 15247 1726867263.34692: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867263.34694: variable 'ansible_pipelining' from source: unknown 15247 1726867263.34696: variable 'ansible_timeout' from source: unknown 15247 1726867263.34746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867263.35044: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867263.35060: variable 'omit' from source: magic vars 15247 1726867263.35070: starting attempt loop 15247 1726867263.35076: running the handler 15247 1726867263.35121: _low_level_execute_command(): starting 15247 1726867263.35124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867263.36643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867263.36735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867263.36782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867263.36937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867263.38604: stdout chunk (state=3): >>>/root <<< 15247 1726867263.38748: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867263.38751: stdout chunk (state=3): >>><<< 15247 1726867263.38754: stderr chunk (state=3): >>><<< 15247 1726867263.38868: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867263.38872: _low_level_execute_command(): starting 15247 1726867263.38875: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062 `" && echo ansible-tmp-1726867263.387758-16796-183668203843062="` echo /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062 `" ) && sleep 0' 15247 1726867263.39460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867263.39572: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867263.39664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867263.39823: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867263.39908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867263.41795: stdout chunk (state=3): >>>ansible-tmp-1726867263.387758-16796-183668203843062=/root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062 <<< 15247 1726867263.42067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867263.42075: stdout chunk (state=3): >>><<< 15247 1726867263.42079: stderr chunk (state=3): >>><<< 15247 1726867263.42293: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867263.387758-16796-183668203843062=/root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867263.42297: variable 'ansible_module_compression' from source: unknown 15247 1726867263.42300: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 15247 1726867263.42567: variable 'ansible_facts' from source: unknown 15247 1726867263.43454: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py 15247 1726867263.43861: Sending initial data 15247 1726867263.43871: Sent initial data (161 bytes) 15247 1726867263.44471: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867263.44505: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867263.44520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867263.44910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867263.44978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867263.46639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 15247 1726867263.46800: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py" <<< 15247 1726867263.46805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp53w0x7jg /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py <<< 15247 1726867263.46893: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15247 1726867263.46909: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp53w0x7jg" to remote "/root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py" <<< 15247 1726867263.50522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867263.50533: stdout chunk (state=3): >>><<< 15247 1726867263.50544: stderr chunk (state=3): >>><<< 15247 1726867263.50767: done transferring module to remote 15247 1726867263.50771: _low_level_execute_command(): starting 15247 1726867263.50773: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/ /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py && sleep 0' 15247 1726867263.51583: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867263.51588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867263.51590: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867263.51592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867263.51594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867263.51692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867263.51761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867263.53612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867263.53641: stderr chunk (state=3): >>><<< 15247 1726867263.53651: stdout chunk (state=3): >>><<< 15247 1726867263.53675: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867263.53686: _low_level_execute_command(): starting 15247 1726867263.53695: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/AnsiballZ_package_facts.py && sleep 0' 15247 1726867263.54266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867263.54283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867263.54301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867263.54320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867263.54337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867263.54349: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867263.54363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867263.54388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867263.54475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867263.54636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867263.54688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867264.00123: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", 
"version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": 
[{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": 
[{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 15247 1726867264.00152: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 15247 1726867264.01550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867264.01554: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867264.01557: stdout chunk (state=3): >>><<< 15247 1726867264.01560: stderr chunk (state=3): >>><<< 15247 1726867264.01566: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867264.06258: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867264.06270: _low_level_execute_command(): starting 15247 1726867264.06274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867263.387758-16796-183668203843062/ > /dev/null 2>&1 && sleep 0' 15247 1726867264.08712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867264.08745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867264.08748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867264.08751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 <<< 15247 1726867264.08753: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867264.08837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867264.08949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867264.09165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867264.10905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867264.10936: stderr chunk (state=3): >>><<< 15247 1726867264.10946: stdout chunk (state=3): >>><<< 15247 1726867264.10965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 15247 1726867264.10995: handler run complete 15247 1726867264.12719: variable 'ansible_facts' from source: unknown 15247 1726867264.13294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.16284: variable 'ansible_facts' from source: unknown 15247 1726867264.17104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.19245: attempt loop complete, returning result 15247 1726867264.19248: _execute() done 15247 1726867264.19251: dumping result to json 15247 1726867264.19949: done dumping result, returning 15247 1726867264.20293: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-8ce3-1923-0000000003e8] 15247 1726867264.20296: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e8 15247 1726867264.25585: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000003e8 15247 1726867264.25589: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867264.25726: no more pending results, returning what we have 15247 1726867264.25729: results queue empty 15247 1726867264.25730: checking for any_errors_fatal 15247 1726867264.25735: done checking for any_errors_fatal 15247 1726867264.25736: checking for max_fail_percentage 15247 1726867264.25737: done checking for max_fail_percentage 15247 1726867264.25738: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.25739: done checking to see if all hosts have failed 15247 1726867264.25740: getting the remaining hosts for this loop 15247 1726867264.25741: done getting the remaining hosts for this loop 15247 1726867264.25744: getting the next task for host managed_node2 15247 1726867264.25751: done 
getting next task for host managed_node2 15247 1726867264.25754: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 15247 1726867264.25756: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867264.25883: getting variables 15247 1726867264.25885: in VariableManager get_vars() 15247 1726867264.25925: Calling all_inventory to load vars for managed_node2 15247 1726867264.25929: Calling groups_inventory to load vars for managed_node2 15247 1726867264.25931: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.25942: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.25945: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.25949: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.27799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.30427: done with get_vars() 15247 1726867264.30456: done getting variables 15247 1726867264.30533: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:21:04 -0400 (0:00:00.983) 0:00:34.015 ****** 15247 1726867264.30582: entering _queue_task() for managed_node2/debug 15247 1726867264.31011: worker is 1 (out of 1 available) 15247 
1726867264.31023: exiting _queue_task() for managed_node2/debug 15247 1726867264.31034: done queuing things up, now waiting for results queue to drain 15247 1726867264.31035: waiting for pending results... 15247 1726867264.31325: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 15247 1726867264.31342: in run() - task 0affcac9-a3a5-8ce3-1923-00000000005b 15247 1726867264.31365: variable 'ansible_search_path' from source: unknown 15247 1726867264.31374: variable 'ansible_search_path' from source: unknown 15247 1726867264.31426: calling self._execute() 15247 1726867264.31538: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.31556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.31573: variable 'omit' from source: magic vars 15247 1726867264.32283: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.32291: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.32295: variable 'omit' from source: magic vars 15247 1726867264.32322: variable 'omit' from source: magic vars 15247 1726867264.32437: variable 'network_provider' from source: set_fact 15247 1726867264.32465: variable 'omit' from source: magic vars 15247 1726867264.32515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867264.32560: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867264.32586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867264.32620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867264.32636: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 15247 1726867264.32682: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867264.32686: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.32689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.32837: Set connection var ansible_shell_executable to /bin/sh 15247 1726867264.32841: Set connection var ansible_connection to ssh 15247 1726867264.32844: Set connection var ansible_shell_type to sh 15247 1726867264.32846: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867264.32853: Set connection var ansible_timeout to 10 15247 1726867264.32856: Set connection var ansible_pipelining to False 15247 1726867264.32881: variable 'ansible_shell_executable' from source: unknown 15247 1726867264.32891: variable 'ansible_connection' from source: unknown 15247 1726867264.32946: variable 'ansible_module_compression' from source: unknown 15247 1726867264.32949: variable 'ansible_shell_type' from source: unknown 15247 1726867264.32952: variable 'ansible_shell_executable' from source: unknown 15247 1726867264.32954: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.32955: variable 'ansible_pipelining' from source: unknown 15247 1726867264.32963: variable 'ansible_timeout' from source: unknown 15247 1726867264.32966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.33090: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867264.33108: variable 'omit' from source: magic vars 15247 1726867264.33164: starting attempt loop 15247 1726867264.33167: running the handler 15247 1726867264.33202: handler run 
complete 15247 1726867264.33225: attempt loop complete, returning result 15247 1726867264.33233: _execute() done 15247 1726867264.33240: dumping result to json 15247 1726867264.33247: done dumping result, returning 15247 1726867264.33272: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-8ce3-1923-00000000005b] 15247 1726867264.33275: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005b ok: [managed_node2] => {} MSG: Using network provider: nm 15247 1726867264.33544: no more pending results, returning what we have 15247 1726867264.33547: results queue empty 15247 1726867264.33549: checking for any_errors_fatal 15247 1726867264.33559: done checking for any_errors_fatal 15247 1726867264.33560: checking for max_fail_percentage 15247 1726867264.33562: done checking for max_fail_percentage 15247 1726867264.33563: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.33564: done checking to see if all hosts have failed 15247 1726867264.33564: getting the remaining hosts for this loop 15247 1726867264.33566: done getting the remaining hosts for this loop 15247 1726867264.33569: getting the next task for host managed_node2 15247 1726867264.33579: done getting next task for host managed_node2 15247 1726867264.33584: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15247 1726867264.33586: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867264.33595: getting variables 15247 1726867264.33597: in VariableManager get_vars() 15247 1726867264.33637: Calling all_inventory to load vars for managed_node2 15247 1726867264.33639: Calling groups_inventory to load vars for managed_node2 15247 1726867264.33642: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.33652: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.33656: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.33660: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.34302: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005b 15247 1726867264.34306: WORKER PROCESS EXITING 15247 1726867264.35346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.38614: done with get_vars() 15247 1726867264.38645: done getting variables 15247 1726867264.38724: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:21:04 -0400 (0:00:00.081) 0:00:34.097 ****** 15247 1726867264.38768: entering _queue_task() for managed_node2/fail 15247 1726867264.39189: worker is 1 (out of 1 available) 15247 1726867264.39202: exiting _queue_task() for managed_node2/fail 15247 1726867264.39218: done queuing things up, now waiting for results queue to drain 15247 1726867264.39220: waiting for pending results... 
15247 1726867264.39501: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 15247 1726867264.39659: in run() - task 0affcac9-a3a5-8ce3-1923-00000000005c 15247 1726867264.39694: variable 'ansible_search_path' from source: unknown 15247 1726867264.39705: variable 'ansible_search_path' from source: unknown 15247 1726867264.39758: calling self._execute() 15247 1726867264.39858: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.39876: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.39906: variable 'omit' from source: magic vars 15247 1726867264.40386: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.40420: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.40723: variable 'network_state' from source: role '' defaults 15247 1726867264.40726: Evaluated conditional (network_state != {}): False 15247 1726867264.40729: when evaluation is False, skipping this task 15247 1726867264.40732: _execute() done 15247 1726867264.40737: dumping result to json 15247 1726867264.40750: done dumping result, returning 15247 1726867264.40829: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-8ce3-1923-00000000005c] 15247 1726867264.40836: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005c 15247 1726867264.40925: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005c 15247 1726867264.40929: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867264.41007: no more pending results, 
returning what we have 15247 1726867264.41023: results queue empty 15247 1726867264.41025: checking for any_errors_fatal 15247 1726867264.41036: done checking for any_errors_fatal 15247 1726867264.41037: checking for max_fail_percentage 15247 1726867264.41039: done checking for max_fail_percentage 15247 1726867264.41040: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.41041: done checking to see if all hosts have failed 15247 1726867264.41042: getting the remaining hosts for this loop 15247 1726867264.41043: done getting the remaining hosts for this loop 15247 1726867264.41046: getting the next task for host managed_node2 15247 1726867264.41055: done getting next task for host managed_node2 15247 1726867264.41058: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15247 1726867264.41061: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867264.41079: getting variables 15247 1726867264.41081: in VariableManager get_vars() 15247 1726867264.41302: Calling all_inventory to load vars for managed_node2 15247 1726867264.41305: Calling groups_inventory to load vars for managed_node2 15247 1726867264.41312: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.41329: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.41333: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.41336: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.43484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.45135: done with get_vars() 15247 1726867264.45158: done getting variables 15247 1726867264.45217: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:21:04 -0400 (0:00:00.064) 0:00:34.162 ****** 15247 1726867264.45245: entering _queue_task() for managed_node2/fail 15247 1726867264.45608: worker is 1 (out of 1 available) 15247 1726867264.45623: exiting _queue_task() for managed_node2/fail 15247 1726867264.45633: done queuing things up, now waiting for results queue to drain 15247 1726867264.45634: waiting for pending results... 
15247 1726867264.45963: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 15247 1726867264.45969: in run() - task 0affcac9-a3a5-8ce3-1923-00000000005d 15247 1726867264.46059: variable 'ansible_search_path' from source: unknown 15247 1726867264.46063: variable 'ansible_search_path' from source: unknown 15247 1726867264.46065: calling self._execute() 15247 1726867264.46141: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.46155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.46243: variable 'omit' from source: magic vars 15247 1726867264.47279: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.47390: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.47572: variable 'network_state' from source: role '' defaults 15247 1726867264.47627: Evaluated conditional (network_state != {}): False 15247 1726867264.47654: when evaluation is False, skipping this task 15247 1726867264.47709: _execute() done 15247 1726867264.47713: dumping result to json 15247 1726867264.47716: done dumping result, returning 15247 1726867264.47721: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-8ce3-1923-00000000005d] 15247 1726867264.47825: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005d skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867264.47974: no more pending results, returning what we have 15247 1726867264.47984: results queue empty 15247 1726867264.47985: checking for any_errors_fatal 15247 1726867264.47994: done checking for any_errors_fatal 
15247 1726867264.47995: checking for max_fail_percentage 15247 1726867264.47997: done checking for max_fail_percentage 15247 1726867264.47998: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.47999: done checking to see if all hosts have failed 15247 1726867264.48000: getting the remaining hosts for this loop 15247 1726867264.48001: done getting the remaining hosts for this loop 15247 1726867264.48008: getting the next task for host managed_node2 15247 1726867264.48016: done getting next task for host managed_node2 15247 1726867264.48020: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15247 1726867264.48022: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867264.48037: getting variables 15247 1726867264.48039: in VariableManager get_vars() 15247 1726867264.48288: Calling all_inventory to load vars for managed_node2 15247 1726867264.48292: Calling groups_inventory to load vars for managed_node2 15247 1726867264.48295: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.48301: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005d 15247 1726867264.48303: WORKER PROCESS EXITING 15247 1726867264.48372: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.48376: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.48387: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.50382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.52503: done with get_vars() 15247 1726867264.52568: done getting variables 15247 1726867264.52627: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:21:04 -0400 (0:00:00.074) 0:00:34.236 ****** 15247 1726867264.52662: entering _queue_task() for managed_node2/fail 15247 1726867264.52995: worker is 1 (out of 1 available) 15247 1726867264.53008: exiting _queue_task() for managed_node2/fail 15247 1726867264.53019: done queuing things up, now waiting for results queue to drain 15247 1726867264.53021: waiting for pending results... 
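The trace above shows a task skipped after its `when` conditionals were evaluated in order: `(ansible_distribution_major_version != '6')` came back True, but `(network_state != {})` came back False, so execution stopped there and the result reported that clause as `false_condition`. A minimal sketch of that short-circuit behavior (not ansible-core internals; the helper name and data layout are illustrative):

```python
# Sketch: a task's `when` list is evaluated top to bottom and the task is
# skipped at the first conditional that is False, which is then reported
# back as the "false_condition" in the skip result.

def evaluate_when(conditions, variables):
    """Return (ran, false_condition); stop at the first False conditional."""
    for cond in conditions:
        if not cond["test"](variables):
            return False, cond["label"]
    return True, None

# Facts modeled on the log: a non-EL6 host with an empty network_state.
facts = {"ansible_distribution_major_version": "9", "network_state": {}}

conditions = [
    {"label": "ansible_distribution_major_version != '6'",
     "test": lambda v: v["ansible_distribution_major_version"] != "6"},
    {"label": "network_state != {}",
     "test": lambda v: v["network_state"] != {}},
]

ran, false_condition = evaluate_when(conditions, facts)
# ran is False; false_condition is "network_state != {}" -> task is skipped
```

This mirrors why the skip result above carries `"false_condition": "network_state != {}"` rather than the first, passing conditional.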
15247 1726867264.53322: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 15247 1726867264.53551: in run() - task 0affcac9-a3a5-8ce3-1923-00000000005e 15247 1726867264.53556: variable 'ansible_search_path' from source: unknown 15247 1726867264.53559: variable 'ansible_search_path' from source: unknown 15247 1726867264.53562: calling self._execute() 15247 1726867264.53662: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.53676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.53700: variable 'omit' from source: magic vars 15247 1726867264.54116: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.54283: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.54342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867264.57115: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867264.57195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867264.57239: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867264.57286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867264.57321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867264.57402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.57453: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.57488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.57540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.57563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.57700: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.57704: Evaluated conditional (ansible_distribution_major_version | int > 9): True 15247 1726867264.57840: variable 'ansible_distribution' from source: facts 15247 1726867264.57856: variable '__network_rh_distros' from source: role '' defaults 15247 1726867264.57878: Evaluated conditional (ansible_distribution in __network_rh_distros): True 15247 1726867264.58247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.58251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.58253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 
1726867264.58282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.58308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.58371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.58401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.58435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.58485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.58505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.58557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.58601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 15247 1726867264.58634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.58675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.58797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.59039: variable 'network_connections' from source: play vars 15247 1726867264.59054: variable 'profile' from source: play vars 15247 1726867264.59131: variable 'profile' from source: play vars 15247 1726867264.59144: variable 'interface' from source: set_fact 15247 1726867264.59209: variable 'interface' from source: set_fact 15247 1726867264.59232: variable 'network_state' from source: role '' defaults 15247 1726867264.59325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867264.59521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867264.59582: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867264.59627: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867264.59669: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867264.59726: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867264.59773: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867264.59807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.59847: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867264.59886: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 15247 1726867264.59993: when evaluation is False, skipping this task 15247 1726867264.59996: _execute() done 15247 1726867264.59999: dumping result to json 15247 1726867264.60001: done dumping result, returning 15247 1726867264.60004: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-8ce3-1923-00000000005e] 15247 1726867264.60006: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005e skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 15247 1726867264.60123: no more pending results, returning what we have 15247 1726867264.60126: results queue empty 15247 1726867264.60127: checking for 
any_errors_fatal 15247 1726867264.60138: done checking for any_errors_fatal 15247 1726867264.60139: checking for max_fail_percentage 15247 1726867264.60141: done checking for max_fail_percentage 15247 1726867264.60141: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.60142: done checking to see if all hosts have failed 15247 1726867264.60143: getting the remaining hosts for this loop 15247 1726867264.60144: done getting the remaining hosts for this loop 15247 1726867264.60148: getting the next task for host managed_node2 15247 1726867264.60154: done getting next task for host managed_node2 15247 1726867264.60179: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15247 1726867264.60182: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867264.60199: getting variables 15247 1726867264.60201: in VariableManager get_vars() 15247 1726867264.60238: Calling all_inventory to load vars for managed_node2 15247 1726867264.60386: Calling groups_inventory to load vars for managed_node2 15247 1726867264.60389: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.60399: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.60402: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.60405: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.60975: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005e 15247 1726867264.60980: WORKER PROCESS EXITING 15247 1726867264.62185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.64704: done with get_vars() 15247 1726867264.64757: done getting variables 15247 1726867264.64849: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:21:04 -0400 (0:00:00.122) 0:00:34.358 ****** 15247 1726867264.64881: entering _queue_task() for managed_node2/dnf 15247 1726867264.65291: worker is 1 (out of 1 available) 15247 1726867264.65304: exiting _queue_task() for managed_node2/dnf 15247 1726867264.65316: done queuing things up, now waiting for results queue to drain 15247 1726867264.65317: waiting for pending results... 
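The teaming abort task above was skipped because its conditional, `network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | ...`, evaluated to False. Re-expressed in plain Python rather than Jinja2 filters (a hedged sketch; the function name and sample data are illustrative, and the two `selectattr` chains are merged into one pass, which yields the same boolean):

```python
# Sketch of the team-interface check: selectattr("type", "defined") keeps
# items that have a "type" key, and selectattr("type", "match", "^team$")
# keeps those whose type matches the anchored regex; the task fires only
# if any such profile exists in the connections or the network_state.
import re

def has_team_profiles(network_connections, network_state):
    candidates = list(network_connections) + list(network_state.get("interfaces", []))
    teams = [c for c in candidates
             if "type" in c and re.match(r"^team$", c["type"])]
    return len(teams) > 0

# Example modeled on the play vars in this run (names are illustrative):
connections = [{"name": "ethtest0", "type": "ethernet", "state": "up"}]
print(has_team_profiles(connections, {}))  # False -> the abort task is skipped
```

With no `type: team` profile defined and an empty `network_state`, the check returns False, matching the `skip_reason` in the result above.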
15247 1726867264.65835: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 15247 1726867264.65867: in run() - task 0affcac9-a3a5-8ce3-1923-00000000005f 15247 1726867264.65886: variable 'ansible_search_path' from source: unknown 15247 1726867264.65890: variable 'ansible_search_path' from source: unknown 15247 1726867264.65931: calling self._execute() 15247 1726867264.66025: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.66035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.66041: variable 'omit' from source: magic vars 15247 1726867264.66698: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.66709: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.66950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867264.69633: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867264.69767: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867264.69771: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867264.69773: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867264.69802: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867264.69878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.69923: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.69947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.69987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.70001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.70170: variable 'ansible_distribution' from source: facts 15247 1726867264.70174: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.70189: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 15247 1726867264.70325: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867264.70527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.70530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.70562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.70635: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.70639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.70648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.70679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.70747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.70792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.70853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.70856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.70862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 
1726867264.70895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.70935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.70973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.71131: variable 'network_connections' from source: play vars 15247 1726867264.71181: variable 'profile' from source: play vars 15247 1726867264.71261: variable 'profile' from source: play vars 15247 1726867264.71265: variable 'interface' from source: set_fact 15247 1726867264.71325: variable 'interface' from source: set_fact 15247 1726867264.71406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867264.71650: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867264.71724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867264.71728: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867264.71758: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867264.71799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867264.71821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867264.71892: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.71895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867264.71953: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867264.72437: variable 'network_connections' from source: play vars 15247 1726867264.72440: variable 'profile' from source: play vars 15247 1726867264.72442: variable 'profile' from source: play vars 15247 1726867264.72448: variable 'interface' from source: set_fact 15247 1726867264.72495: variable 'interface' from source: set_fact 15247 1726867264.72551: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867264.72554: when evaluation is False, skipping this task 15247 1726867264.72557: _execute() done 15247 1726867264.72559: dumping result to json 15247 1726867264.72561: done dumping result, returning 15247 1726867264.72570: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-00000000005f] 15247 1726867264.72576: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005f 15247 1726867264.72702: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000005f 15247 1726867264.72706: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 15247 1726867264.72760: no more pending results, returning what we have 15247 1726867264.72764: results queue empty 15247 1726867264.72766: checking for any_errors_fatal 15247 1726867264.72775: done checking for any_errors_fatal 15247 1726867264.72776: checking for max_fail_percentage 15247 1726867264.72780: done checking for max_fail_percentage 15247 1726867264.72781: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.72782: done checking to see if all hosts have failed 15247 1726867264.72783: getting the remaining hosts for this loop 15247 1726867264.72785: done getting the remaining hosts for this loop 15247 1726867264.72789: getting the next task for host managed_node2 15247 1726867264.72795: done getting next task for host managed_node2 15247 1726867264.72800: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15247 1726867264.72801: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867264.72936: getting variables 15247 1726867264.72938: in VariableManager get_vars() 15247 1726867264.73125: Calling all_inventory to load vars for managed_node2 15247 1726867264.73128: Calling groups_inventory to load vars for managed_node2 15247 1726867264.73131: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.73141: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.73144: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.73148: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.75270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.77365: done with get_vars() 15247 1726867264.77388: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 15247 1726867264.77470: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:21:04 -0400 (0:00:00.126) 0:00:34.484 ****** 15247 1726867264.77501: entering _queue_task() for managed_node2/yum 15247 1726867264.77801: worker is 1 (out of 1 available) 15247 1726867264.77811: exiting _queue_task() for managed_node2/yum 15247 1726867264.77824: done queuing things up, now waiting for results queue to drain 15247 1726867264.77825: waiting for pending results... 
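The two package-check tasks above are mutually gated on the distribution version: the DNF variant ran its conditionals under `ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7`, while the YUM variant (redirected to the dnf action plugin by ansible-core) is skipped whenever `ansible_distribution_major_version | int < 8` is False. A simplified routing sketch under those assumptions (the function is illustrative, not part of the role):

```python
# Sketch: which package-manager task would apply, per the `when` clauses
# visible in the trace. Fedora and EL8+ take the DNF path; EL7 and
# earlier take the YUM path.
def package_manager_for(distribution, major_version):
    if distribution == "Fedora" or int(major_version) > 7:
        return "dnf"
    return "yum"

print(package_manager_for("RedHat", "9"))  # dnf -> the YUM task is skipped
```

On this EL9 host the YUM conditional `ansible_distribution_major_version | int < 8` is False, which is exactly the skip recorded above.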
15247 1726867264.78105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 15247 1726867264.78221: in run() - task 0affcac9-a3a5-8ce3-1923-000000000060 15247 1726867264.78230: variable 'ansible_search_path' from source: unknown 15247 1726867264.78234: variable 'ansible_search_path' from source: unknown 15247 1726867264.78269: calling self._execute() 15247 1726867264.78785: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.78789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.78792: variable 'omit' from source: magic vars 15247 1726867264.78795: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.78798: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.78938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867264.83156: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867264.83292: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867264.83325: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867264.83357: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867264.83637: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867264.84009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.84049: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.84079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.84173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.84195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.84483: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.84541: Evaluated conditional (ansible_distribution_major_version | int < 8): False 15247 1726867264.84545: when evaluation is False, skipping this task 15247 1726867264.84547: _execute() done 15247 1726867264.84550: dumping result to json 15247 1726867264.84553: done dumping result, returning 15247 1726867264.84560: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000060] 15247 1726867264.84565: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000060 15247 1726867264.84674: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000060 15247 1726867264.84681: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 15247 1726867264.84735: no more pending results, returning 
what we have 15247 1726867264.84739: results queue empty 15247 1726867264.84740: checking for any_errors_fatal 15247 1726867264.84749: done checking for any_errors_fatal 15247 1726867264.84750: checking for max_fail_percentage 15247 1726867264.84752: done checking for max_fail_percentage 15247 1726867264.84753: checking to see if all hosts have failed and the running result is not ok 15247 1726867264.84754: done checking to see if all hosts have failed 15247 1726867264.84755: getting the remaining hosts for this loop 15247 1726867264.84757: done getting the remaining hosts for this loop 15247 1726867264.84761: getting the next task for host managed_node2 15247 1726867264.84768: done getting next task for host managed_node2 15247 1726867264.84773: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15247 1726867264.84775: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867264.84791: getting variables 15247 1726867264.84793: in VariableManager get_vars() 15247 1726867264.84837: Calling all_inventory to load vars for managed_node2 15247 1726867264.84840: Calling groups_inventory to load vars for managed_node2 15247 1726867264.84843: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867264.84854: Calling all_plugins_play to load vars for managed_node2 15247 1726867264.84858: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867264.84862: Calling groups_plugins_play to load vars for managed_node2 15247 1726867264.87236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867264.89494: done with get_vars() 15247 1726867264.89521: done getting variables 15247 1726867264.89580: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:21:04 -0400 (0:00:00.121) 0:00:34.605 ****** 15247 1726867264.89610: entering _queue_task() for managed_node2/fail 15247 1726867264.89905: worker is 1 (out of 1 available) 15247 1726867264.89920: exiting _queue_task() for managed_node2/fail 15247 1726867264.89931: done queuing things up, now waiting for results queue to drain 15247 1726867264.89932: waiting for pending results... 
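The skip above hinges on the conditional `ansible_distribution_major_version | int < 8` evaluating to False. As a hedged sketch in plain Python (not Ansible's actual Jinja2 evaluator, and the fact value here is hypothetical), the effect of that `when:` expression is:

```python
# Hedged sketch: the effect of the `when:` conditional seen in the log,
#   ansible_distribution_major_version | int < 8
# Plain Python stand-in, NOT Ansible's real templating engine; the fact
# value below is a hypothetical example.
facts = {"ansible_distribution_major_version": "10"}

# The Jinja2 `int` filter coerces the string-typed fact to an integer
# before the comparison is made.
when_result = int(facts["ansible_distribution_major_version"]) < 8
print(when_result)  # False, so the task is skipped with skip_reason
                    # "Conditional result was False"
```

This matches the log's `false_condition` field in the skipped-task result: the raw expression is reported back verbatim alongside `"skip_reason": "Conditional result was False"`.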
15247 1726867264.90817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 15247 1726867264.90884: in run() - task 0affcac9-a3a5-8ce3-1923-000000000061 15247 1726867264.91127: variable 'ansible_search_path' from source: unknown 15247 1726867264.91131: variable 'ansible_search_path' from source: unknown 15247 1726867264.91134: calling self._execute() 15247 1726867264.91249: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867264.91263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867264.91384: variable 'omit' from source: magic vars 15247 1726867264.92169: variable 'ansible_distribution_major_version' from source: facts 15247 1726867264.92189: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867264.92424: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867264.92841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867264.98064: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867264.98393: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867264.98473: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867264.98576: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867264.98689: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867264.98891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15247 1726867264.99190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.99194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.99196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.99199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.99201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.99307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.99416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.99461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.99530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867264.99661: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867264.99699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867264.99755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867264.99901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867264.99924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.00484: variable 'network_connections' from source: play vars 15247 1726867265.00487: variable 'profile' from source: play vars 15247 1726867265.00616: variable 'profile' from source: play vars 15247 1726867265.00628: variable 'interface' from source: set_fact 15247 1726867265.00795: variable 'interface' from source: set_fact 15247 1726867265.00921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867265.01254: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867265.01297: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867265.01391: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867265.01426: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867265.01507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867265.01600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867265.01710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.01744: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867265.01982: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867265.02548: variable 'network_connections' from source: play vars 15247 1726867265.02666: variable 'profile' from source: play vars 15247 1726867265.02670: variable 'profile' from source: play vars 15247 1726867265.02672: variable 'interface' from source: set_fact 15247 1726867265.02801: variable 'interface' from source: set_fact 15247 1726867265.02909: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867265.02920: when evaluation is False, skipping this task 15247 1726867265.02927: _execute() done 15247 1726867265.02933: dumping result to json 15247 1726867265.02940: done dumping result, returning 15247 1726867265.02951: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's 
consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000061] 15247 1726867265.02970: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000061 15247 1726867265.03276: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000061 15247 1726867265.03281: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15247 1726867265.03366: no more pending results, returning what we have 15247 1726867265.03370: results queue empty 15247 1726867265.03371: checking for any_errors_fatal 15247 1726867265.03381: done checking for any_errors_fatal 15247 1726867265.03382: checking for max_fail_percentage 15247 1726867265.03384: done checking for max_fail_percentage 15247 1726867265.03385: checking to see if all hosts have failed and the running result is not ok 15247 1726867265.03385: done checking to see if all hosts have failed 15247 1726867265.03386: getting the remaining hosts for this loop 15247 1726867265.03388: done getting the remaining hosts for this loop 15247 1726867265.03391: getting the next task for host managed_node2 15247 1726867265.03398: done getting next task for host managed_node2 15247 1726867265.03402: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 15247 1726867265.03403: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867265.03419: getting variables 15247 1726867265.03422: in VariableManager get_vars() 15247 1726867265.03463: Calling all_inventory to load vars for managed_node2 15247 1726867265.03466: Calling groups_inventory to load vars for managed_node2 15247 1726867265.03469: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867265.03783: Calling all_plugins_play to load vars for managed_node2 15247 1726867265.03788: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867265.03792: Calling groups_plugins_play to load vars for managed_node2 15247 1726867265.06597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867265.10635: done with get_vars() 15247 1726867265.10668: done getting variables 15247 1726867265.10736: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:21:05 -0400 (0:00:00.211) 0:00:34.817 ****** 15247 1726867265.10772: entering _queue_task() for managed_node2/package 15247 1726867265.11642: worker is 1 (out of 1 available) 15247 1726867265.11655: exiting _queue_task() for managed_node2/package 15247 1726867265.11670: done queuing things up, now waiting for results queue to drain 15247 1726867265.11671: waiting for pending results... 
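The "Ask user's consent" task above is skipped because `__network_wireless_connections_defined or __network_team_connections_defined` is False. A minimal sketch of how such flags can be derived from `network_connections` (this is illustrative Python, not the role's actual Jinja2 defaults, and the connection list is a hypothetical example):

```python
# Hedged sketch: deriving wireless/team "defined" flags from
# network_connections, mirroring the conditional evaluated in the log.
# This is NOT the role's real code; the play vars below are made up.
network_connections = [
    {"name": "eth0", "type": "ethernet", "state": "up"},
]

# True only if at least one connection profile uses the given type.
wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

print(wireless_defined or team_defined)  # False -> task skipped, so
                                         # NetworkManager is not restarted
```

With only an ethernet profile defined, neither flag is set, which is why the log records the combined conditional as False and skips the restart-consent prompt.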
15247 1726867265.12499: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 15247 1726867265.12508: in run() - task 0affcac9-a3a5-8ce3-1923-000000000062 15247 1726867265.12533: variable 'ansible_search_path' from source: unknown 15247 1726867265.12543: variable 'ansible_search_path' from source: unknown 15247 1726867265.12587: calling self._execute() 15247 1726867265.12822: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.12833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.12848: variable 'omit' from source: magic vars 15247 1726867265.13638: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.13655: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867265.13943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867265.14224: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867265.14278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867265.14321: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867265.14405: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867265.14530: variable 'network_packages' from source: role '' defaults 15247 1726867265.14646: variable '__network_provider_setup' from source: role '' defaults 15247 1726867265.14661: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867265.14735: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867265.14750: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867265.14820: variable 
'__network_packages_default_nm' from source: role '' defaults 15247 1726867265.15082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867265.17537: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867265.17611: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867265.17653: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867265.17693: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867265.17730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867265.17807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.17852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.17891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.17983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.17986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 
1726867265.18015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.18049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.18081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.18126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.18148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.18470: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15247 1726867265.18503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.18535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.18563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.18614: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.18642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.18737: variable 'ansible_python' from source: facts 15247 1726867265.18779: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15247 1726867265.18866: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867265.18956: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867265.19096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.19129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.19159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.19206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.19231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.19338: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.19350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.19370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.19421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.19588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.19861: variable 'network_connections' from source: play vars 15247 1726867265.19864: variable 'profile' from source: play vars 15247 1726867265.20166: variable 'profile' from source: play vars 15247 1726867265.20276: variable 'interface' from source: set_fact 15247 1726867265.20282: variable 'interface' from source: set_fact 15247 1726867265.20440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867265.20475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867265.20511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.20590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867265.20709: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867265.21200: variable 'network_connections' from source: play vars 15247 1726867265.21214: variable 'profile' from source: play vars 15247 1726867265.21317: variable 'profile' from source: play vars 15247 1726867265.21331: variable 'interface' from source: set_fact 15247 1726867265.21432: variable 'interface' from source: set_fact 15247 1726867265.21535: variable '__network_packages_default_wireless' from source: role '' defaults 15247 1726867265.21618: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867265.21928: variable 'network_connections' from source: play vars 15247 1726867265.21941: variable 'profile' from source: play vars 15247 1726867265.22009: variable 'profile' from source: play vars 15247 1726867265.22079: variable 'interface' from source: set_fact 15247 1726867265.22119: variable 'interface' from source: set_fact 15247 1726867265.22147: variable '__network_packages_default_team' from source: role '' defaults 15247 1726867265.22230: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867265.22520: variable 'network_connections' from source: play vars 15247 1726867265.22529: variable 'profile' from source: play vars 15247 1726867265.22593: variable 'profile' from source: play vars 15247 1726867265.22602: variable 'interface' from source: set_fact 15247 1726867265.22699: variable 'interface' from source: set_fact 15247 1726867265.22763: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867265.22826: 
variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867265.22885: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867265.22908: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867265.23127: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15247 1726867265.24285: variable 'network_connections' from source: play vars 15247 1726867265.24289: variable 'profile' from source: play vars 15247 1726867265.24326: variable 'profile' from source: play vars 15247 1726867265.24535: variable 'interface' from source: set_fact 15247 1726867265.24538: variable 'interface' from source: set_fact 15247 1726867265.24540: variable 'ansible_distribution' from source: facts 15247 1726867265.24542: variable '__network_rh_distros' from source: role '' defaults 15247 1726867265.24544: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.24546: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15247 1726867265.24998: variable 'ansible_distribution' from source: facts 15247 1726867265.25036: variable '__network_rh_distros' from source: role '' defaults 15247 1726867265.25096: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.25131: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15247 1726867265.25583: variable 'ansible_distribution' from source: facts 15247 1726867265.25810: variable '__network_rh_distros' from source: role '' defaults 15247 1726867265.25813: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.25816: variable 'network_provider' from source: set_fact 15247 1726867265.25818: variable 'ansible_facts' from source: unknown 15247 1726867265.26833: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 15247 
1726867265.26842: when evaluation is False, skipping this task 15247 1726867265.26851: _execute() done 15247 1726867265.26860: dumping result to json 15247 1726867265.26869: done dumping result, returning 15247 1726867265.26886: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-8ce3-1923-000000000062] 15247 1726867265.26904: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000062 skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 15247 1726867265.27138: no more pending results, returning what we have 15247 1726867265.27142: results queue empty 15247 1726867265.27143: checking for any_errors_fatal 15247 1726867265.27153: done checking for any_errors_fatal 15247 1726867265.27154: checking for max_fail_percentage 15247 1726867265.27155: done checking for max_fail_percentage 15247 1726867265.27156: checking to see if all hosts have failed and the running result is not ok 15247 1726867265.27157: done checking to see if all hosts have failed 15247 1726867265.27158: getting the remaining hosts for this loop 15247 1726867265.27160: done getting the remaining hosts for this loop 15247 1726867265.27163: getting the next task for host managed_node2 15247 1726867265.27171: done getting next task for host managed_node2 15247 1726867265.27175: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15247 1726867265.27179: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867265.27194: getting variables 15247 1726867265.27196: in VariableManager get_vars() 15247 1726867265.27244: Calling all_inventory to load vars for managed_node2 15247 1726867265.27250: Calling groups_inventory to load vars for managed_node2 15247 1726867265.27255: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867265.27267: Calling all_plugins_play to load vars for managed_node2 15247 1726867265.27288: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000062 15247 1726867265.27565: WORKER PROCESS EXITING 15247 1726867265.27549: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867265.27572: Calling groups_plugins_play to load vars for managed_node2 15247 1726867265.31917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867265.34300: done with get_vars() 15247 1726867265.34330: done getting variables 15247 1726867265.34399: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:21:05 -0400 (0:00:00.236) 0:00:35.053 ****** 15247 1726867265.34435: entering _queue_task() for managed_node2/package 15247 1726867265.34768: worker is 1 (out of 1 available) 15247 1726867265.34902: exiting _queue_task() for managed_node2/package 15247 1726867265.34915: done queuing things up, now waiting for results queue to drain 15247 1726867265.34916: waiting for pending results... 
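The "Install packages" skip above comes from `not network_packages is subset(ansible_facts.packages.keys())` evaluating to False. Ansible's `subset` test is essentially set containment; as a hedged sketch (plain Python, with hypothetical package names and a facts layout modeled on the `package_facts` module's dict-of-lists shape):

```python
# Hedged sketch: the `subset` test from the log's false_condition,
#   not network_packages is subset(ansible_facts.packages.keys())
# Package names and versions below are hypothetical examples.
network_packages = ["NetworkManager"]

# ansible_facts.packages maps package name -> list of installed versions.
installed_packages = {
    "NetworkManager": [{"version": "1.48.10"}],
    "openssh-clients": [{"version": "9.6"}],
}

# `subset` reduces to set containment: every required package
# must already appear among the installed package names.
needs_install = not set(network_packages) <= set(installed_packages.keys())
print(needs_install)  # False: everything is installed, so the
                      # package task is skipped
```

Because the required packages are already present in the gathered facts, the condition is False and the task is skipped rather than invoking the package manager.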
15247 1726867265.35065: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 15247 1726867265.35159: in run() - task 0affcac9-a3a5-8ce3-1923-000000000063 15247 1726867265.35172: variable 'ansible_search_path' from source: unknown 15247 1726867265.35179: variable 'ansible_search_path' from source: unknown 15247 1726867265.35207: calling self._execute() 15247 1726867265.35286: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.35290: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.35301: variable 'omit' from source: magic vars 15247 1726867265.35749: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.35801: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867265.36095: variable 'network_state' from source: role '' defaults 15247 1726867265.36098: Evaluated conditional (network_state != {}): False 15247 1726867265.36101: when evaluation is False, skipping this task 15247 1726867265.36103: _execute() done 15247 1726867265.36105: dumping result to json 15247 1726867265.36107: done dumping result, returning 15247 1726867265.36109: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-8ce3-1923-000000000063] 15247 1726867265.36187: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000063 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867265.36455: no more pending results, returning what we have 15247 1726867265.36459: results queue empty 15247 1726867265.36461: checking for any_errors_fatal 15247 1726867265.36467: done checking for any_errors_fatal 15247 1726867265.36467: checking for max_fail_percentage 15247 
1726867265.36469: done checking for max_fail_percentage 15247 1726867265.36470: checking to see if all hosts have failed and the running result is not ok 15247 1726867265.36470: done checking to see if all hosts have failed 15247 1726867265.36471: getting the remaining hosts for this loop 15247 1726867265.36473: done getting the remaining hosts for this loop 15247 1726867265.36478: getting the next task for host managed_node2 15247 1726867265.36485: done getting next task for host managed_node2 15247 1726867265.36493: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15247 1726867265.36495: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867265.36509: getting variables 15247 1726867265.36512: in VariableManager get_vars() 15247 1726867265.36560: Calling all_inventory to load vars for managed_node2 15247 1726867265.36563: Calling groups_inventory to load vars for managed_node2 15247 1726867265.36565: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867265.36685: Calling all_plugins_play to load vars for managed_node2 15247 1726867265.36690: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867265.36794: Calling groups_plugins_play to load vars for managed_node2 15247 1726867265.37410: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000063 15247 1726867265.37413: WORKER PROCESS EXITING 15247 1726867265.38317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867265.39824: done with get_vars() 15247 1726867265.39857: done getting variables 15247 1726867265.39927: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:21:05 -0400 (0:00:00.055) 0:00:35.109 ****** 15247 1726867265.39955: entering _queue_task() for managed_node2/package 15247 1726867265.40243: worker is 1 (out of 1 available) 15247 1726867265.40256: exiting _queue_task() for managed_node2/package 15247 1726867265.40269: done queuing things up, now waiting for results queue to drain 15247 1726867265.40270: waiting for pending results... 15247 1726867265.40457: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 15247 1726867265.40526: in run() - task 0affcac9-a3a5-8ce3-1923-000000000064 15247 1726867265.40538: variable 'ansible_search_path' from source: unknown 15247 1726867265.40542: variable 'ansible_search_path' from source: unknown 15247 1726867265.40583: calling self._execute() 15247 1726867265.40664: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.40669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.40680: variable 'omit' from source: magic vars 15247 1726867265.40959: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.40968: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867265.41067: variable 'network_state' from source: role '' defaults 15247 1726867265.41075: Evaluated conditional (network_state != {}): False 15247 1726867265.41085: when evaluation is False, 
skipping this task 15247 1726867265.41089: _execute() done 15247 1726867265.41091: dumping result to json 15247 1726867265.41093: done dumping result, returning 15247 1726867265.41096: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-8ce3-1923-000000000064] 15247 1726867265.41112: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000064 15247 1726867265.41267: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000064 15247 1726867265.41270: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867265.41341: no more pending results, returning what we have 15247 1726867265.41344: results queue empty 15247 1726867265.41345: checking for any_errors_fatal 15247 1726867265.41351: done checking for any_errors_fatal 15247 1726867265.41351: checking for max_fail_percentage 15247 1726867265.41353: done checking for max_fail_percentage 15247 1726867265.41353: checking to see if all hosts have failed and the running result is not ok 15247 1726867265.41354: done checking to see if all hosts have failed 15247 1726867265.41355: getting the remaining hosts for this loop 15247 1726867265.41356: done getting the remaining hosts for this loop 15247 1726867265.41359: getting the next task for host managed_node2 15247 1726867265.41363: done getting next task for host managed_node2 15247 1726867265.41366: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15247 1726867265.41368: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867265.41382: getting variables 15247 1726867265.41383: in VariableManager get_vars() 15247 1726867265.41425: Calling all_inventory to load vars for managed_node2 15247 1726867265.41431: Calling groups_inventory to load vars for managed_node2 15247 1726867265.41433: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867265.41442: Calling all_plugins_play to load vars for managed_node2 15247 1726867265.41446: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867265.41449: Calling groups_plugins_play to load vars for managed_node2 15247 1726867265.42605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867265.44373: done with get_vars() 15247 1726867265.44419: done getting variables 15247 1726867265.44470: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:21:05 -0400 (0:00:00.045) 0:00:35.154 ****** 15247 1726867265.44498: entering _queue_task() for managed_node2/service 15247 1726867265.44763: worker is 1 (out of 1 available) 15247 1726867265.44788: exiting _queue_task() for managed_node2/service 15247 1726867265.44801: done queuing things up, now waiting for results queue to drain 15247 1726867265.44802: waiting for pending results... 
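Editor's note: both nmstate-related install tasks above were skipped on the same gate, `network_state != {}`. The role's default for `network_state` is an empty mapping, so these tasks only fire when the caller supplies a non-empty state. A trivial sketch of that evaluation (values assumed from the logged "role '' defaults" source):

```python
# Sketch of the "network_state != {}" gate seen twice above.
network_state = {}  # role default; no network_state passed in this run

run_nmstate_installs = (network_state != {})
print(run_nmstate_installs)  # False -> both install tasks skipped
```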
15247 1726867265.45105: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 15247 1726867265.45175: in run() - task 0affcac9-a3a5-8ce3-1923-000000000065 15247 1726867265.45189: variable 'ansible_search_path' from source: unknown 15247 1726867265.45194: variable 'ansible_search_path' from source: unknown 15247 1726867265.45198: calling self._execute() 15247 1726867265.45346: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.45350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.45353: variable 'omit' from source: magic vars 15247 1726867265.45696: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.45700: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867265.45838: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867265.46020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867265.48018: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867265.48089: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867265.48139: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867265.48159: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867265.48184: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867265.48249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 15247 1726867265.48286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.48317: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.48393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.48397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.48449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.48452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.48533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.48536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.48563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.48598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.48626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.48679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.48716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.48719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.48948: variable 'network_connections' from source: play vars 15247 1726867265.48951: variable 'profile' from source: play vars 15247 1726867265.49027: variable 'profile' from source: play vars 15247 1726867265.49030: variable 'interface' from source: set_fact 15247 1726867265.49114: variable 'interface' from source: set_fact 15247 1726867265.49203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867265.49366: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867265.49397: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867265.49443: Loading 
TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867265.49475: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867265.49539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867265.49567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867265.49603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.49607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867265.49692: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867265.49947: variable 'network_connections' from source: play vars 15247 1726867265.49950: variable 'profile' from source: play vars 15247 1726867265.50012: variable 'profile' from source: play vars 15247 1726867265.50044: variable 'interface' from source: set_fact 15247 1726867265.50092: variable 'interface' from source: set_fact 15247 1726867265.50131: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 15247 1726867265.50136: when evaluation is False, skipping this task 15247 1726867265.50139: _execute() done 15247 1726867265.50142: dumping result to json 15247 1726867265.50144: done dumping result, returning 15247 1726867265.50146: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart 
NetworkManager due to wireless or team interfaces [0affcac9-a3a5-8ce3-1923-000000000065] 15247 1726867265.50160: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000065 15247 1726867265.50428: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000065 15247 1726867265.50431: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 15247 1726867265.50510: no more pending results, returning what we have 15247 1726867265.50516: results queue empty 15247 1726867265.50517: checking for any_errors_fatal 15247 1726867265.50521: done checking for any_errors_fatal 15247 1726867265.50522: checking for max_fail_percentage 15247 1726867265.50524: done checking for max_fail_percentage 15247 1726867265.50524: checking to see if all hosts have failed and the running result is not ok 15247 1726867265.50525: done checking to see if all hosts have failed 15247 1726867265.50526: getting the remaining hosts for this loop 15247 1726867265.50527: done getting the remaining hosts for this loop 15247 1726867265.50530: getting the next task for host managed_node2 15247 1726867265.50535: done getting next task for host managed_node2 15247 1726867265.50539: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15247 1726867265.50542: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867265.50556: getting variables 15247 1726867265.50557: in VariableManager get_vars() 15247 1726867265.50599: Calling all_inventory to load vars for managed_node2 15247 1726867265.50602: Calling groups_inventory to load vars for managed_node2 15247 1726867265.50605: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867265.50617: Calling all_plugins_play to load vars for managed_node2 15247 1726867265.50620: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867265.50624: Calling groups_plugins_play to load vars for managed_node2 15247 1726867265.51901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867265.53155: done with get_vars() 15247 1726867265.53184: done getting variables 15247 1726867265.53256: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:21:05 -0400 (0:00:00.087) 0:00:35.242 ****** 15247 1726867265.53287: entering _queue_task() for managed_node2/service 15247 1726867265.53539: worker is 1 (out of 1 available) 15247 1726867265.53553: exiting _queue_task() for managed_node2/service 15247 1726867265.53567: done queuing things up, now waiting for results queue to drain 15247 1726867265.53569: waiting for pending results... 
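Editor's note: the "Restart NetworkManager" skip above evaluated `__network_wireless_connections_defined or __network_team_connections_defined` to False. Conceptually, those role defaults scan the requested connection profiles for `wireless` or `team` types. A hedged sketch of that decision — the connection list below is illustrative, not the actual play vars:

```python
# Conceptual mirror of the logged restart conditional: restart is only
# needed when a requested connection is of type "wireless" or "team".

network_connections = [{"name": "profile0", "type": "ethernet"}]  # assumed

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

restart_needed = wireless_defined or team_defined
print(restart_needed)  # False -> restart task skipped, matching the log
```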
15247 1726867265.53806: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 15247 1726867265.53887: in run() - task 0affcac9-a3a5-8ce3-1923-000000000066 15247 1726867265.53893: variable 'ansible_search_path' from source: unknown 15247 1726867265.53897: variable 'ansible_search_path' from source: unknown 15247 1726867265.53927: calling self._execute() 15247 1726867265.54002: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.54006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.54018: variable 'omit' from source: magic vars 15247 1726867265.54343: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.54360: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867265.54535: variable 'network_provider' from source: set_fact 15247 1726867265.54539: variable 'network_state' from source: role '' defaults 15247 1726867265.54548: Evaluated conditional (network_provider == "nm" or network_state != {}): True 15247 1726867265.54553: variable 'omit' from source: magic vars 15247 1726867265.54609: variable 'omit' from source: magic vars 15247 1726867265.54666: variable 'network_service_name' from source: role '' defaults 15247 1726867265.54722: variable 'network_service_name' from source: role '' defaults 15247 1726867265.54819: variable '__network_provider_setup' from source: role '' defaults 15247 1726867265.54824: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867265.54894: variable '__network_service_name_default_nm' from source: role '' defaults 15247 1726867265.54904: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867265.54959: variable '__network_packages_default_nm' from source: role '' defaults 15247 1726867265.55120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 15247 1726867265.57057: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867265.57140: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867265.57182: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867265.57252: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867265.57267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867265.57368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.57446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.57449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.57533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.57540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.57601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 15247 1726867265.57634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.57676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.57714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.57750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.57982: variable '__network_packages_default_gobject_packages' from source: role '' defaults 15247 1726867265.58085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.58101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.58119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.58166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.58169: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.58287: variable 'ansible_python' from source: facts 15247 1726867265.58290: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 15247 1726867265.58384: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867265.58429: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867265.58547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.58565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.58583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.58627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.58642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.58698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867265.58740: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867265.58766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.58809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867265.58821: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867265.58916: variable 'network_connections' from source: play vars 15247 1726867265.58919: variable 'profile' from source: play vars 15247 1726867265.58994: variable 'profile' from source: play vars 15247 1726867265.58998: variable 'interface' from source: set_fact 15247 1726867265.59073: variable 'interface' from source: set_fact 15247 1726867265.59169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867265.59332: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867265.59368: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867265.59399: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867265.59444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867265.59502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867265.59528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867265.59573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867265.59606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867265.59643: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867265.59880: variable 'network_connections' from source: play vars 15247 1726867265.59886: variable 'profile' from source: play vars 15247 1726867265.59940: variable 'profile' from source: play vars 15247 1726867265.59943: variable 'interface' from source: set_fact 15247 1726867265.60002: variable 'interface' from source: set_fact 15247 1726867265.60086: variable '__network_packages_default_wireless' from source: role '' defaults 15247 1726867265.60133: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867265.60361: variable 'network_connections' from source: play vars 15247 1726867265.60364: variable 'profile' from source: play vars 15247 1726867265.60461: variable 'profile' from source: play vars 15247 1726867265.60464: variable 'interface' from source: set_fact 15247 1726867265.60524: variable 'interface' from source: set_fact 15247 1726867265.60543: variable '__network_packages_default_team' from source: role '' defaults 15247 1726867265.60601: variable '__network_team_connections_defined' from source: role '' defaults 15247 1726867265.60949: variable 
'network_connections' from source: play vars 15247 1726867265.60952: variable 'profile' from source: play vars 15247 1726867265.61007: variable 'profile' from source: play vars 15247 1726867265.61011: variable 'interface' from source: set_fact 15247 1726867265.61067: variable 'interface' from source: set_fact 15247 1726867265.61107: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867265.61153: variable '__network_service_name_default_initscripts' from source: role '' defaults 15247 1726867265.61159: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867265.61202: variable '__network_packages_default_initscripts' from source: role '' defaults 15247 1726867265.61444: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 15247 1726867265.62005: variable 'network_connections' from source: play vars 15247 1726867265.62008: variable 'profile' from source: play vars 15247 1726867265.62051: variable 'profile' from source: play vars 15247 1726867265.62055: variable 'interface' from source: set_fact 15247 1726867265.62139: variable 'interface' from source: set_fact 15247 1726867265.62149: variable 'ansible_distribution' from source: facts 15247 1726867265.62152: variable '__network_rh_distros' from source: role '' defaults 15247 1726867265.62154: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.62188: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 15247 1726867265.62364: variable 'ansible_distribution' from source: facts 15247 1726867265.62367: variable '__network_rh_distros' from source: role '' defaults 15247 1726867265.62374: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.62379: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 15247 1726867265.62514: variable 'ansible_distribution' from source: 
facts 15247 1726867265.62519: variable '__network_rh_distros' from source: role '' defaults 15247 1726867265.62525: variable 'ansible_distribution_major_version' from source: facts 15247 1726867265.62551: variable 'network_provider' from source: set_fact 15247 1726867265.62572: variable 'omit' from source: magic vars 15247 1726867265.62595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867265.62616: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867265.62633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867265.62646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867265.62715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867265.62718: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867265.62720: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.62727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.62896: Set connection var ansible_shell_executable to /bin/sh 15247 1726867265.62900: Set connection var ansible_connection to ssh 15247 1726867265.62902: Set connection var ansible_shell_type to sh 15247 1726867265.62904: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867265.62906: Set connection var ansible_timeout to 10 15247 1726867265.62908: Set connection var ansible_pipelining to False 15247 1726867265.62910: variable 'ansible_shell_executable' from source: unknown 15247 1726867265.62911: variable 'ansible_connection' from source: unknown 15247 1726867265.62913: variable 'ansible_module_compression' from source: unknown 15247 1726867265.62915: 
variable 'ansible_shell_type' from source: unknown 15247 1726867265.62917: variable 'ansible_shell_executable' from source: unknown 15247 1726867265.62919: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867265.62924: variable 'ansible_pipelining' from source: unknown 15247 1726867265.62926: variable 'ansible_timeout' from source: unknown 15247 1726867265.62928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867265.63158: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867265.63161: variable 'omit' from source: magic vars 15247 1726867265.63163: starting attempt loop 15247 1726867265.63166: running the handler 15247 1726867265.63168: variable 'ansible_facts' from source: unknown 15247 1726867265.64132: _low_level_execute_command(): starting 15247 1726867265.64139: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867265.64969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867265.65045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867265.65049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867265.65062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867265.65073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867265.65139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867265.66884: stdout chunk (state=3): >>>/root <<< 15247 1726867265.67024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867265.67197: stderr chunk (state=3): >>><<< 15247 1726867265.67200: stdout chunk (state=3): >>><<< 15247 1726867265.67399: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867265.67418: _low_level_execute_command(): starting 15247 1726867265.67424: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606 `" && echo ansible-tmp-1726867265.6740239-16883-5710256953606="` echo /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606 `" ) && sleep 0' 15247 1726867265.68455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867265.68460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867265.68463: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867265.68465: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867265.68467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867265.68469: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867265.68755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867265.70604: stdout chunk (state=3): >>>ansible-tmp-1726867265.6740239-16883-5710256953606=/root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606 <<< 15247 1726867265.70753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867265.70757: stdout chunk (state=3): >>><<< 15247 1726867265.70764: stderr chunk (state=3): >>><<< 15247 1726867265.70782: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867265.6740239-16883-5710256953606=/root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867265.70815: variable 'ansible_module_compression' from 
source: unknown 15247 1726867265.70870: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 15247 1726867265.70939: variable 'ansible_facts' from source: unknown 15247 1726867265.71536: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py 15247 1726867265.71896: Sending initial data 15247 1726867265.71900: Sent initial data (154 bytes) 15247 1726867265.72855: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867265.72906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867265.72941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867265.73204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867265.73275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867265.75107: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867265.75270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867265.75335: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpscmlrmzv /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py <<< 15247 1726867265.75339: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py" <<< 15247 1726867265.75374: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpscmlrmzv" to remote "/root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py" <<< 15247 1726867265.78320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867265.78391: stderr chunk (state=3): >>><<< 15247 1726867265.78394: stdout chunk (state=3): >>><<< 15247 1726867265.78456: done transferring module to remote 15247 1726867265.78485: _low_level_execute_command(): starting 15247 1726867265.78489: _low_level_execute_command(): 
executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/ /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py && sleep 0' 15247 1726867265.79315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867265.79334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867265.81183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867265.81216: stderr chunk (state=3): >>><<< 15247 1726867265.81233: stdout chunk (state=3): >>><<< 15247 1726867265.81238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867265.81241: _low_level_execute_command(): starting 15247 1726867265.81248: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/AnsiballZ_systemd.py && sleep 0' 15247 1726867265.82344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867265.82582: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867265.82648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867265.82767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867266.11985: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4472832", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307831296", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "760438000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": 
"0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 15247 1726867266.13826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867266.13844: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867266.14018: stderr chunk (state=3): >>><<< 15247 1726867266.14022: stdout chunk (state=3): >>><<< 15247 1726867266.14041: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6928", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainStartTimestampMonotonic": "284277161", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ExecMainHandoffTimestampMonotonic": "284292999", "ExecMainPID": "6928", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4195", "MemoryCurrent": "4472832", "MemoryPeak": "8298496", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3307831296", "EffectiveMemoryMax": "3702870016", "EffectiveMemoryHigh": "3702870016", "CPUUsageNSec": "760438000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target multi-user.target shutdown.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service system.slice network-pre.target dbus.socket sysinit.target systemd-journald.socket cloud-init-local.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:18 EDT", "StateChangeTimestampMonotonic": "396930889", "InactiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveExitTimestampMonotonic": "284278359", "ActiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveEnterTimestampMonotonic": "284371120", "ActiveExitTimestamp": "Fri 2024-09-20 17:17:26 EDT", "ActiveExitTimestampMonotonic": "284248566", "InactiveEnterTimestamp": "Fri 2024-09-20 17:17:26 EDT", "InactiveEnterTimestampMonotonic": "284273785", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:17:26 EDT", 
"ConditionTimestampMonotonic": "284275676", "AssertTimestamp": "Fri 2024-09-20 17:17:26 EDT", "AssertTimestampMonotonic": "284275682", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4565dcb3a30f406b9973d652f75a5d4f", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867266.14484: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867266.14488: _low_level_execute_command(): starting 15247 1726867266.14490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867265.6740239-16883-5710256953606/ > /dev/null 2>&1 && sleep 0' 15247 1726867266.15725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867266.15986: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867266.16298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867266.16355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867266.18245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867266.18250: stdout chunk (state=3): >>><<< 15247 1726867266.18253: stderr chunk (state=3): >>><<< 15247 1726867266.18271: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867266.18285: handler run complete 15247 1726867266.18357: attempt loop complete, returning result 15247 
1726867266.18439: _execute() done 15247 1726867266.18445: dumping result to json 15247 1726867266.18464: done dumping result, returning 15247 1726867266.18479: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-8ce3-1923-000000000066] 15247 1726867266.18490: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000066 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867266.19051: no more pending results, returning what we have 15247 1726867266.19055: results queue empty 15247 1726867266.19056: checking for any_errors_fatal 15247 1726867266.19064: done checking for any_errors_fatal 15247 1726867266.19065: checking for max_fail_percentage 15247 1726867266.19067: done checking for max_fail_percentage 15247 1726867266.19067: checking to see if all hosts have failed and the running result is not ok 15247 1726867266.19068: done checking to see if all hosts have failed 15247 1726867266.19069: getting the remaining hosts for this loop 15247 1726867266.19070: done getting the remaining hosts for this loop 15247 1726867266.19075: getting the next task for host managed_node2 15247 1726867266.19084: done getting next task for host managed_node2 15247 1726867266.19088: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15247 1726867266.19090: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867266.19100: getting variables 15247 1726867266.19102: in VariableManager get_vars() 15247 1726867266.19165: Calling all_inventory to load vars for managed_node2 15247 1726867266.19168: Calling groups_inventory to load vars for managed_node2 15247 1726867266.19170: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867266.19289: Calling all_plugins_play to load vars for managed_node2 15247 1726867266.19294: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867266.19298: Calling groups_plugins_play to load vars for managed_node2 15247 1726867266.19818: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000066 15247 1726867266.20384: WORKER PROCESS EXITING 15247 1726867266.21997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867266.25501: done with get_vars() 15247 1726867266.25533: done getting variables 15247 1726867266.25710: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:21:06 -0400 (0:00:00.724) 0:00:35.967 ****** 15247 1726867266.25745: entering _queue_task() for managed_node2/service 15247 1726867266.26360: worker is 1 (out of 1 available) 15247 1726867266.26373: exiting _queue_task() for managed_node2/service 15247 1726867266.26598: done queuing things up, now waiting for results queue to drain 15247 1726867266.26600: waiting for pending results... 
15247 1726867266.26842: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 15247 1726867266.27049: in run() - task 0affcac9-a3a5-8ce3-1923-000000000067 15247 1726867266.27140: variable 'ansible_search_path' from source: unknown 15247 1726867266.27150: variable 'ansible_search_path' from source: unknown 15247 1726867266.27195: calling self._execute() 15247 1726867266.27433: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867266.27455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867266.27664: variable 'omit' from source: magic vars 15247 1726867266.28254: variable 'ansible_distribution_major_version' from source: facts 15247 1726867266.28329: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867266.28553: variable 'network_provider' from source: set_fact 15247 1726867266.28563: Evaluated conditional (network_provider == "nm"): True 15247 1726867266.28767: variable '__network_wpa_supplicant_required' from source: role '' defaults 15247 1726867266.29076: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 15247 1726867266.29249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867266.34063: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867266.34166: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867266.34256: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867266.34356: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867266.34447: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867266.34669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867266.34707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867266.34737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867266.34964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867266.34967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867266.34970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867266.35202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867266.35205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867266.35208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867266.35210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867266.35484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867266.35518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867266.35548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867266.35595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867266.35615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867266.36144: variable 'network_connections' from source: play vars 15247 1726867266.36147: variable 'profile' from source: play vars 15247 1726867266.36266: variable 'profile' from source: play vars 15247 1726867266.36336: variable 'interface' from source: set_fact 15247 1726867266.36463: variable 'interface' from source: set_fact 15247 1726867266.36533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 15247 1726867266.37029: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 15247 1726867266.37068: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 15247 1726867266.37183: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 15247 1726867266.37186: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 15247 1726867266.37189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 15247 1726867266.37220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 15247 1726867266.37280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867266.37341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 15247 1726867266.37398: variable '__network_wireless_connections_defined' from source: role '' defaults 15247 1726867266.37659: variable 'network_connections' from source: play vars 15247 1726867266.37672: variable 'profile' from source: play vars 15247 1726867266.37738: variable 'profile' from source: play vars 15247 1726867266.37755: variable 'interface' from source: set_fact 15247 1726867266.37818: variable 'interface' from source: set_fact 15247 1726867266.37858: Evaluated conditional (__network_wpa_supplicant_required): False 15247 1726867266.37867: when evaluation is False, skipping this task 15247 1726867266.37875: _execute() done 15247 1726867266.37966: dumping result 
to json 15247 1726867266.37970: done dumping result, returning 15247 1726867266.37972: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-8ce3-1923-000000000067] 15247 1726867266.37975: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000067 15247 1726867266.38044: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000067 15247 1726867266.38049: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 15247 1726867266.38101: no more pending results, returning what we have 15247 1726867266.38104: results queue empty 15247 1726867266.38105: checking for any_errors_fatal 15247 1726867266.38121: done checking for any_errors_fatal 15247 1726867266.38122: checking for max_fail_percentage 15247 1726867266.38124: done checking for max_fail_percentage 15247 1726867266.38124: checking to see if all hosts have failed and the running result is not ok 15247 1726867266.38125: done checking to see if all hosts have failed 15247 1726867266.38126: getting the remaining hosts for this loop 15247 1726867266.38127: done getting the remaining hosts for this loop 15247 1726867266.38130: getting the next task for host managed_node2 15247 1726867266.38137: done getting next task for host managed_node2 15247 1726867266.38140: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 15247 1726867266.38142: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867266.38154: getting variables 15247 1726867266.38155: in VariableManager get_vars() 15247 1726867266.38199: Calling all_inventory to load vars for managed_node2 15247 1726867266.38202: Calling groups_inventory to load vars for managed_node2 15247 1726867266.38204: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867266.38216: Calling all_plugins_play to load vars for managed_node2 15247 1726867266.38219: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867266.38222: Calling groups_plugins_play to load vars for managed_node2 15247 1726867266.41201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867266.43069: done with get_vars() 15247 1726867266.43096: done getting variables 15247 1726867266.43161: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:21:06 -0400 (0:00:00.179) 0:00:36.146 ****** 15247 1726867266.43681: entering _queue_task() for managed_node2/service 15247 1726867266.44854: worker is 1 (out of 1 available) 15247 1726867266.44872: exiting _queue_task() for managed_node2/service 15247 1726867266.44887: done queuing things up, now waiting for results queue to drain 15247 1726867266.44889: waiting for pending results... 
15247 1726867266.45425: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 15247 1726867266.45737: in run() - task 0affcac9-a3a5-8ce3-1923-000000000068 15247 1726867266.45741: variable 'ansible_search_path' from source: unknown 15247 1726867266.45744: variable 'ansible_search_path' from source: unknown 15247 1726867266.45746: calling self._execute() 15247 1726867266.45882: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867266.45948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867266.45972: variable 'omit' from source: magic vars 15247 1726867266.46615: variable 'ansible_distribution_major_version' from source: facts 15247 1726867266.46618: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867266.46736: variable 'network_provider' from source: set_fact 15247 1726867266.46832: Evaluated conditional (network_provider == "initscripts"): False 15247 1726867266.46835: when evaluation is False, skipping this task 15247 1726867266.46837: _execute() done 15247 1726867266.46839: dumping result to json 15247 1726867266.46841: done dumping result, returning 15247 1726867266.46843: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-8ce3-1923-000000000068] 15247 1726867266.46845: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000068 15247 1726867266.46918: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000068 15247 1726867266.46922: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 15247 1726867266.46980: no more pending results, returning what we have 15247 1726867266.46984: results queue empty 15247 1726867266.46985: checking for any_errors_fatal 15247 1726867266.46996: done checking for 
any_errors_fatal 15247 1726867266.46997: checking for max_fail_percentage 15247 1726867266.46999: done checking for max_fail_percentage 15247 1726867266.47000: checking to see if all hosts have failed and the running result is not ok 15247 1726867266.47001: done checking to see if all hosts have failed 15247 1726867266.47002: getting the remaining hosts for this loop 15247 1726867266.47003: done getting the remaining hosts for this loop 15247 1726867266.47007: getting the next task for host managed_node2 15247 1726867266.47015: done getting next task for host managed_node2 15247 1726867266.47019: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15247 1726867266.47022: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867266.47036: getting variables 15247 1726867266.47038: in VariableManager get_vars() 15247 1726867266.47079: Calling all_inventory to load vars for managed_node2 15247 1726867266.47081: Calling groups_inventory to load vars for managed_node2 15247 1726867266.47084: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867266.47097: Calling all_plugins_play to load vars for managed_node2 15247 1726867266.47101: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867266.47104: Calling groups_plugins_play to load vars for managed_node2 15247 1726867266.48723: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867266.50637: done with get_vars() 15247 1726867266.50675: done getting variables 15247 1726867266.50769: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:21:06 -0400 (0:00:00.071) 0:00:36.217 ****** 15247 1726867266.50807: entering _queue_task() for managed_node2/copy 15247 1726867266.51171: worker is 1 (out of 1 available) 15247 1726867266.51184: exiting _queue_task() for managed_node2/copy 15247 1726867266.51204: done queuing things up, now waiting for results queue to drain 15247 1726867266.51206: waiting for pending results... 
15247 1726867266.51893: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 15247 1726867266.51929: in run() - task 0affcac9-a3a5-8ce3-1923-000000000069 15247 1726867266.51952: variable 'ansible_search_path' from source: unknown 15247 1726867266.51961: variable 'ansible_search_path' from source: unknown 15247 1726867266.52004: calling self._execute() 15247 1726867266.52103: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867266.52119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867266.52136: variable 'omit' from source: magic vars 15247 1726867266.52516: variable 'ansible_distribution_major_version' from source: facts 15247 1726867266.52535: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867266.52885: variable 'network_provider' from source: set_fact 15247 1726867266.52893: Evaluated conditional (network_provider == "initscripts"): False 15247 1726867266.52896: when evaluation is False, skipping this task 15247 1726867266.52899: _execute() done 15247 1726867266.52903: dumping result to json 15247 1726867266.52905: done dumping result, returning 15247 1726867266.52909: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-8ce3-1923-000000000069] 15247 1726867266.52911: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000069 15247 1726867266.52984: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000069 15247 1726867266.52988: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 15247 1726867266.53031: no more pending results, returning what we have 15247 1726867266.53034: results queue empty 15247 1726867266.53035: checking for 
any_errors_fatal 15247 1726867266.53040: done checking for any_errors_fatal 15247 1726867266.53040: checking for max_fail_percentage 15247 1726867266.53042: done checking for max_fail_percentage 15247 1726867266.53043: checking to see if all hosts have failed and the running result is not ok 15247 1726867266.53043: done checking to see if all hosts have failed 15247 1726867266.53044: getting the remaining hosts for this loop 15247 1726867266.53045: done getting the remaining hosts for this loop 15247 1726867266.53106: getting the next task for host managed_node2 15247 1726867266.53112: done getting next task for host managed_node2 15247 1726867266.53118: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15247 1726867266.53120: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867266.53132: getting variables 15247 1726867266.53134: in VariableManager get_vars() 15247 1726867266.53166: Calling all_inventory to load vars for managed_node2 15247 1726867266.53168: Calling groups_inventory to load vars for managed_node2 15247 1726867266.53171: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867266.53182: Calling all_plugins_play to load vars for managed_node2 15247 1726867266.53185: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867266.53188: Calling groups_plugins_play to load vars for managed_node2 15247 1726867266.58422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867266.59982: done with get_vars() 15247 1726867266.60011: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:21:06 -0400 (0:00:00.092) 0:00:36.310 ****** 15247 1726867266.60097: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15247 1726867266.60472: worker is 1 (out of 1 available) 15247 1726867266.60686: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 15247 1726867266.60698: done queuing things up, now waiting for results queue to drain 15247 1726867266.60700: waiting for pending results... 
15247 1726867266.60817: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 15247 1726867266.60963: in run() - task 0affcac9-a3a5-8ce3-1923-00000000006a 15247 1726867266.60989: variable 'ansible_search_path' from source: unknown 15247 1726867266.60998: variable 'ansible_search_path' from source: unknown 15247 1726867266.61045: calling self._execute() 15247 1726867266.61158: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867266.61171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867266.61188: variable 'omit' from source: magic vars 15247 1726867266.61590: variable 'ansible_distribution_major_version' from source: facts 15247 1726867266.61692: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867266.61696: variable 'omit' from source: magic vars 15247 1726867266.61699: variable 'omit' from source: magic vars 15247 1726867266.61836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 15247 1726867266.63990: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 15247 1726867266.64071: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 15247 1726867266.64117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 15247 1726867266.64157: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 15247 1726867266.64292: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 15247 1726867266.64295: variable 'network_provider' from source: set_fact 15247 1726867266.64419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 15247 1726867266.64453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 15247 1726867266.64489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 15247 1726867266.64539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 15247 1726867266.64559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 15247 1726867266.64642: variable 'omit' from source: magic vars 15247 1726867266.64776: variable 'omit' from source: magic vars 15247 1726867266.64895: variable 'network_connections' from source: play vars 15247 1726867266.64919: variable 'profile' from source: play vars 15247 1726867266.64994: variable 'profile' from source: play vars 15247 1726867266.65004: variable 'interface' from source: set_fact 15247 1726867266.65071: variable 'interface' from source: set_fact 15247 1726867266.65226: variable 'omit' from source: magic vars 15247 1726867266.65243: variable '__lsr_ansible_managed' from source: task vars 15247 1726867266.65381: variable '__lsr_ansible_managed' from source: task vars 15247 1726867266.65524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 15247 1726867266.65771: Loaded config def from plugin (lookup/template) 15247 1726867266.65788: Loading LookupModule 'template' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 15247 1726867266.65826: File lookup term: get_ansible_managed.j2 15247 1726867266.65834: variable 'ansible_search_path' from source: unknown 15247 1726867266.65843: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 15247 1726867266.65861: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 15247 1726867266.65892: variable 'ansible_search_path' from source: unknown 15247 1726867266.72400: variable 'ansible_managed' from source: unknown 15247 1726867266.72576: variable 'omit' from source: magic vars 15247 1726867266.72622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867266.72656: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867266.72705: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867266.72882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867266.72885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867266.72888: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867266.72891: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867266.72893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867266.72904: Set connection var ansible_shell_executable to /bin/sh 15247 1726867266.72912: Set connection var ansible_connection to ssh 15247 1726867266.72922: Set connection var ansible_shell_type to sh 15247 1726867266.72933: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867266.72944: Set connection var ansible_timeout to 10 15247 1726867266.72954: Set connection var ansible_pipelining to False 15247 1726867266.72984: variable 'ansible_shell_executable' from source: unknown 15247 1726867266.72992: variable 'ansible_connection' from source: unknown 15247 1726867266.72999: variable 'ansible_module_compression' from source: unknown 15247 1726867266.73010: variable 'ansible_shell_type' from source: unknown 15247 1726867266.73021: variable 'ansible_shell_executable' from source: unknown 15247 1726867266.73028: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867266.73036: variable 'ansible_pipelining' from source: unknown 15247 1726867266.73042: variable 'ansible_timeout' from source: unknown 15247 1726867266.73050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867266.73224: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867266.73235: variable 'omit' from source: magic vars 15247 1726867266.73237: starting attempt loop 15247 1726867266.73239: running the handler 15247 1726867266.73241: _low_level_execute_command(): starting 15247 1726867266.73253: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867266.73948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867266.73999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867266.74019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867266.74099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867266.74111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867266.74194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867266.75882: stdout chunk (state=3): >>>/root <<< 15247 1726867266.76034: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867266.76116: stderr chunk (state=3): >>><<< 15247 1726867266.76120: stdout chunk (state=3): >>><<< 15247 1726867266.76263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867266.76266: _low_level_execute_command(): starting 15247 1726867266.76269: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360 `" && echo ansible-tmp-1726867266.761642-16938-189407766860360="` echo /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360 `" ) && sleep 0' 15247 1726867266.77047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867266.77051: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867266.77156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867266.77197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867266.77232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867266.79232: stdout chunk (state=3): >>>ansible-tmp-1726867266.761642-16938-189407766860360=/root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360 <<< 15247 1726867266.79335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867266.79382: stderr chunk (state=3): >>><<< 15247 1726867266.79385: stdout chunk (state=3): >>><<< 15247 1726867266.79405: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867266.761642-16938-189407766860360=/root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867266.79456: variable 'ansible_module_compression' from source: unknown 15247 1726867266.79494: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 15247 1726867266.79530: variable 'ansible_facts' from source: unknown 15247 1726867266.79610: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py 15247 1726867266.79727: Sending initial data 15247 1726867266.79731: Sent initial data (167 bytes) 15247 1726867266.80379: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867266.80385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867266.80433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867266.80533: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867266.82189: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867266.82292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867266.82305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpidxes8pn /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py <<< 15247 1726867266.82338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py" <<< 15247 1726867266.82366: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpidxes8pn" to remote "/root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py" <<< 15247 1726867266.83092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867266.83131: stderr chunk (state=3): >>><<< 15247 1726867266.83145: stdout chunk (state=3): >>><<< 15247 1726867266.83159: done transferring module to remote 15247 1726867266.83168: _low_level_execute_command(): starting 15247 1726867266.83172: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/ /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py && sleep 0' 15247 1726867266.83570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867266.83603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867266.83606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867266.83611: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867266.83617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867266.83619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867266.83621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867266.83666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867266.83669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867266.83710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867266.85530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867266.85552: stderr chunk (state=3): >>><<< 15247 1726867266.85555: stdout chunk (state=3): >>><<< 15247 1726867266.85564: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867266.85582: _low_level_execute_command(): starting 15247 1726867266.85585: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/AnsiballZ_network_connections.py && sleep 0' 15247 1726867266.85963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867266.85967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867266.85996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867266.85999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867266.86001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867266.86003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867266.86047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867266.86066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867266.86117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.13079: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x5_5tjmm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x5_5tjmm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65: error=unknown <<< 15247 1726867267.13249: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": 
""}}} <<< 15247 1726867267.15092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867267.15112: stderr chunk (state=3): >>><<< 15247 1726867267.15115: stdout chunk (state=3): >>><<< 15247 1726867267.15132: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x5_5tjmm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_x5_5tjmm/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/0dcbd6e7-6c97-413c-bbd2-1db5f36f2a65: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867267.15158: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867267.15168: _low_level_execute_command(): starting 15247 1726867267.15171: 
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867266.761642-16938-189407766860360/ > /dev/null 2>&1 && sleep 0' 15247 1726867267.15588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867267.15591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.15594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867267.15598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867267.15600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.15646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867267.15649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.15700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.17515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.17537: stderr chunk (state=3): >>><<< 15247 1726867267.17541: stdout chunk (state=3): >>><<< 15247 1726867267.17554: _low_level_execute_command() done: 
rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.17560: handler run complete 15247 1726867267.17583: attempt loop complete, returning result 15247 1726867267.17586: _execute() done 15247 1726867267.17589: dumping result to json 15247 1726867267.17591: done dumping result, returning 15247 1726867267.17599: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-8ce3-1923-00000000006a] 15247 1726867267.17605: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006a 15247 1726867267.17708: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006a 15247 1726867267.17710: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 15247 1726867267.17802: no more pending results, returning what we have 15247 1726867267.17805: results queue empty 15247 1726867267.17806: checking for any_errors_fatal 15247 1726867267.17817: done checking for any_errors_fatal 15247 1726867267.17817: checking for max_fail_percentage 15247 1726867267.17819: done checking for max_fail_percentage 15247 1726867267.17820: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.17829: done checking to see if all hosts have failed 15247 1726867267.17830: getting the remaining hosts for this loop 15247 1726867267.17831: done getting the remaining hosts for this loop 15247 1726867267.17835: getting the next task for host managed_node2 15247 1726867267.17840: done getting next task for host managed_node2 15247 1726867267.17844: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 15247 1726867267.17846: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867267.17855: getting variables 15247 1726867267.17857: in VariableManager get_vars() 15247 1726867267.17892: Calling all_inventory to load vars for managed_node2 15247 1726867267.17895: Calling groups_inventory to load vars for managed_node2 15247 1726867267.17897: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.17905: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.17908: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.17910: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.18953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.20441: done with get_vars() 15247 1726867267.20457: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:21:07 -0400 (0:00:00.604) 0:00:36.914 ****** 15247 1726867267.20518: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15247 1726867267.20754: worker is 1 (out of 1 available) 15247 1726867267.20769: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 15247 1726867267.20784: done queuing things up, now waiting for results queue to drain 15247 1726867267.20785: waiting for pending results... 
15247 1726867267.20964: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 15247 1726867267.21049: in run() - task 0affcac9-a3a5-8ce3-1923-00000000006b 15247 1726867267.21061: variable 'ansible_search_path' from source: unknown 15247 1726867267.21064: variable 'ansible_search_path' from source: unknown 15247 1726867267.21095: calling self._execute() 15247 1726867267.21170: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.21179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.21188: variable 'omit' from source: magic vars 15247 1726867267.21465: variable 'ansible_distribution_major_version' from source: facts 15247 1726867267.21474: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867267.21562: variable 'network_state' from source: role '' defaults 15247 1726867267.21573: Evaluated conditional (network_state != {}): False 15247 1726867267.21576: when evaluation is False, skipping this task 15247 1726867267.21582: _execute() done 15247 1726867267.21584: dumping result to json 15247 1726867267.21587: done dumping result, returning 15247 1726867267.21589: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-8ce3-1923-00000000006b] 15247 1726867267.21596: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006b skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 15247 1726867267.21732: no more pending results, returning what we have 15247 1726867267.21736: results queue empty 15247 1726867267.21737: checking for any_errors_fatal 15247 1726867267.21748: done checking for any_errors_fatal 15247 1726867267.21748: checking for max_fail_percentage 15247 1726867267.21750: done checking for max_fail_percentage 15247 1726867267.21751: 
checking to see if all hosts have failed and the running result is not ok 15247 1726867267.21752: done checking to see if all hosts have failed 15247 1726867267.21753: getting the remaining hosts for this loop 15247 1726867267.21754: done getting the remaining hosts for this loop 15247 1726867267.21758: getting the next task for host managed_node2 15247 1726867267.21763: done getting next task for host managed_node2 15247 1726867267.21767: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15247 1726867267.21769: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867267.21785: getting variables 15247 1726867267.21788: in VariableManager get_vars() 15247 1726867267.21820: Calling all_inventory to load vars for managed_node2 15247 1726867267.21822: Calling groups_inventory to load vars for managed_node2 15247 1726867267.21824: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.21833: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.21835: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.21838: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.22391: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006b 15247 1726867267.22394: WORKER PROCESS EXITING 15247 1726867267.23101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.23997: done with get_vars() 15247 1726867267.24011: done getting variables 15247 1726867267.24055: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:21:07 -0400 (0:00:00.035) 0:00:36.950 ****** 15247 1726867267.24083: entering _queue_task() for managed_node2/debug 15247 1726867267.24299: worker is 1 (out of 1 available) 15247 1726867267.24316: exiting _queue_task() for managed_node2/debug 15247 1726867267.24328: done queuing things up, now waiting for results queue to drain 15247 1726867267.24330: waiting for pending results... 15247 1726867267.24497: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 15247 1726867267.24567: in run() - task 0affcac9-a3a5-8ce3-1923-00000000006c 15247 1726867267.24581: variable 'ansible_search_path' from source: unknown 15247 1726867267.24585: variable 'ansible_search_path' from source: unknown 15247 1726867267.24611: calling self._execute() 15247 1726867267.24681: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.24688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.24696: variable 'omit' from source: magic vars 15247 1726867267.24970: variable 'ansible_distribution_major_version' from source: facts 15247 1726867267.24981: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867267.24991: variable 'omit' from source: magic vars 15247 1726867267.25020: variable 'omit' from source: magic vars 15247 1726867267.25046: variable 'omit' from source: magic vars 15247 1726867267.25076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867267.25111: Loading 
Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867267.25124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867267.25143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.25157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.25189: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867267.25216: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.25221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.25302: Set connection var ansible_shell_executable to /bin/sh 15247 1726867267.25305: Set connection var ansible_connection to ssh 15247 1726867267.25307: Set connection var ansible_shell_type to sh 15247 1726867267.25309: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867267.25321: Set connection var ansible_timeout to 10 15247 1726867267.25324: Set connection var ansible_pipelining to False 15247 1726867267.25351: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.25355: variable 'ansible_connection' from source: unknown 15247 1726867267.25358: variable 'ansible_module_compression' from source: unknown 15247 1726867267.25360: variable 'ansible_shell_type' from source: unknown 15247 1726867267.25363: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.25368: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.25370: variable 'ansible_pipelining' from source: unknown 15247 1726867267.25372: variable 'ansible_timeout' from source: unknown 15247 1726867267.25383: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867267.25541: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867267.25585: variable 'omit' from source: magic vars 15247 1726867267.25597: starting attempt loop 15247 1726867267.25600: running the handler 15247 1726867267.25704: variable '__network_connections_result' from source: set_fact 15247 1726867267.25744: handler run complete 15247 1726867267.25757: attempt loop complete, returning result 15247 1726867267.25760: _execute() done 15247 1726867267.25763: dumping result to json 15247 1726867267.25766: done dumping result, returning 15247 1726867267.25773: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-8ce3-1923-00000000006c] 15247 1726867267.25839: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006c ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 15247 1726867267.26119: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006c 15247 1726867267.26122: WORKER PROCESS EXITING 15247 1726867267.26131: no more pending results, returning what we have 15247 1726867267.26134: results queue empty 15247 1726867267.26135: checking for any_errors_fatal 15247 1726867267.26138: done checking for any_errors_fatal 15247 1726867267.26139: checking for max_fail_percentage 15247 1726867267.26140: done checking for max_fail_percentage 15247 1726867267.26141: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.26142: done checking to see if all hosts have failed 15247 1726867267.26143: getting the remaining hosts for this loop 15247 1726867267.26144: done getting the remaining hosts for this loop 
15247 1726867267.26147: getting the next task for host managed_node2 15247 1726867267.26151: done getting next task for host managed_node2 15247 1726867267.26154: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15247 1726867267.26155: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867267.26162: getting variables 15247 1726867267.26163: in VariableManager get_vars() 15247 1726867267.26187: Calling all_inventory to load vars for managed_node2 15247 1726867267.26189: Calling groups_inventory to load vars for managed_node2 15247 1726867267.26191: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.26197: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.26198: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.26200: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.27532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.28872: done with get_vars() 15247 1726867267.28897: done getting variables 15247 1726867267.28945: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:21:07 -0400 (0:00:00.048) 0:00:36.999 
****** 15247 1726867267.28981: entering _queue_task() for managed_node2/debug 15247 1726867267.29250: worker is 1 (out of 1 available) 15247 1726867267.29263: exiting _queue_task() for managed_node2/debug 15247 1726867267.29275: done queuing things up, now waiting for results queue to drain 15247 1726867267.29305: waiting for pending results... 15247 1726867267.29605: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 15247 1726867267.29631: in run() - task 0affcac9-a3a5-8ce3-1923-00000000006d 15247 1726867267.29644: variable 'ansible_search_path' from source: unknown 15247 1726867267.29647: variable 'ansible_search_path' from source: unknown 15247 1726867267.29689: calling self._execute() 15247 1726867267.29766: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.29771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.29781: variable 'omit' from source: magic vars 15247 1726867267.30093: variable 'ansible_distribution_major_version' from source: facts 15247 1726867267.30103: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867267.30108: variable 'omit' from source: magic vars 15247 1726867267.30165: variable 'omit' from source: magic vars 15247 1726867267.30195: variable 'omit' from source: magic vars 15247 1726867267.30220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867267.30247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867267.30261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867267.30273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.30285: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.30311: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867267.30317: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.30320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.30451: Set connection var ansible_shell_executable to /bin/sh 15247 1726867267.30455: Set connection var ansible_connection to ssh 15247 1726867267.30458: Set connection var ansible_shell_type to sh 15247 1726867267.30460: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867267.30463: Set connection var ansible_timeout to 10 15247 1726867267.30465: Set connection var ansible_pipelining to False 15247 1726867267.30493: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.30496: variable 'ansible_connection' from source: unknown 15247 1726867267.30499: variable 'ansible_module_compression' from source: unknown 15247 1726867267.30501: variable 'ansible_shell_type' from source: unknown 15247 1726867267.30504: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.30506: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.30508: variable 'ansible_pipelining' from source: unknown 15247 1726867267.30510: variable 'ansible_timeout' from source: unknown 15247 1726867267.30512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.30601: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867267.30611: variable 'omit' from source: magic vars 15247 1726867267.30617: starting attempt 
loop 15247 1726867267.30620: running the handler 15247 1726867267.30660: variable '__network_connections_result' from source: set_fact 15247 1726867267.30712: variable '__network_connections_result' from source: set_fact 15247 1726867267.30784: handler run complete 15247 1726867267.30800: attempt loop complete, returning result 15247 1726867267.30803: _execute() done 15247 1726867267.30805: dumping result to json 15247 1726867267.30808: done dumping result, returning 15247 1726867267.30818: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-8ce3-1923-00000000006d] 15247 1726867267.30821: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006d 15247 1726867267.30903: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006d 15247 1726867267.30906: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 15247 1726867267.30976: no more pending results, returning what we have 15247 1726867267.30980: results queue empty 15247 1726867267.30982: checking for any_errors_fatal 15247 1726867267.30987: done checking for any_errors_fatal 15247 1726867267.30988: checking for max_fail_percentage 15247 1726867267.30989: done checking for max_fail_percentage 15247 1726867267.30990: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.30991: done checking to see if all hosts have failed 15247 1726867267.30991: getting the remaining hosts for this loop 15247 1726867267.30993: done getting the remaining hosts for this loop 15247 1726867267.30996: getting the 
next task for host managed_node2 15247 1726867267.31000: done getting next task for host managed_node2 15247 1726867267.31003: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15247 1726867267.31005: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867267.31015: getting variables 15247 1726867267.31017: in VariableManager get_vars() 15247 1726867267.31045: Calling all_inventory to load vars for managed_node2 15247 1726867267.31047: Calling groups_inventory to load vars for managed_node2 15247 1726867267.31049: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.31056: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.31059: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.31061: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.31808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.32700: done with get_vars() 15247 1726867267.32719: done getting variables 15247 1726867267.32756: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:21:07 -0400 (0:00:00.037) 0:00:37.037 ****** 15247 1726867267.32781: entering _queue_task() 
for managed_node2/debug 15247 1726867267.33000: worker is 1 (out of 1 available) 15247 1726867267.33021: exiting _queue_task() for managed_node2/debug 15247 1726867267.33033: done queuing things up, now waiting for results queue to drain 15247 1726867267.33034: waiting for pending results... 15247 1726867267.33354: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 15247 1726867267.33367: in run() - task 0affcac9-a3a5-8ce3-1923-00000000006e 15247 1726867267.33372: variable 'ansible_search_path' from source: unknown 15247 1726867267.33375: variable 'ansible_search_path' from source: unknown 15247 1726867267.33446: calling self._execute() 15247 1726867267.33500: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.33511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.33543: variable 'omit' from source: magic vars 15247 1726867267.33987: variable 'ansible_distribution_major_version' from source: facts 15247 1726867267.33990: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867267.34130: variable 'network_state' from source: role '' defaults 15247 1726867267.34134: Evaluated conditional (network_state != {}): False 15247 1726867267.34137: when evaluation is False, skipping this task 15247 1726867267.34139: _execute() done 15247 1726867267.34142: dumping result to json 15247 1726867267.34144: done dumping result, returning 15247 1726867267.34147: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-8ce3-1923-00000000006e] 15247 1726867267.34149: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006e 15247 1726867267.34251: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006e 15247 1726867267.34255: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": 
"network_state != {}" } 15247 1726867267.34321: no more pending results, returning what we have 15247 1726867267.34325: results queue empty 15247 1726867267.34326: checking for any_errors_fatal 15247 1726867267.34331: done checking for any_errors_fatal 15247 1726867267.34336: checking for max_fail_percentage 15247 1726867267.34337: done checking for max_fail_percentage 15247 1726867267.34338: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.34339: done checking to see if all hosts have failed 15247 1726867267.34339: getting the remaining hosts for this loop 15247 1726867267.34340: done getting the remaining hosts for this loop 15247 1726867267.34344: getting the next task for host managed_node2 15247 1726867267.34348: done getting next task for host managed_node2 15247 1726867267.34351: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 15247 1726867267.34353: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867267.34373: getting variables 15247 1726867267.34375: in VariableManager get_vars() 15247 1726867267.34409: Calling all_inventory to load vars for managed_node2 15247 1726867267.34412: Calling groups_inventory to load vars for managed_node2 15247 1726867267.34416: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.34424: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.34427: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.34430: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.35728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.36624: done with get_vars() 15247 1726867267.36638: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:21:07 -0400 (0:00:00.039) 0:00:37.076 ****** 15247 1726867267.36701: entering _queue_task() for managed_node2/ping 15247 1726867267.36922: worker is 1 (out of 1 available) 15247 1726867267.36938: exiting _queue_task() for managed_node2/ping 15247 1726867267.36953: done queuing things up, now waiting for results queue to drain 15247 1726867267.36954: waiting for pending results... 
15247 1726867267.37169: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 15247 1726867267.37254: in run() - task 0affcac9-a3a5-8ce3-1923-00000000006f 15247 1726867267.37259: variable 'ansible_search_path' from source: unknown 15247 1726867267.37261: variable 'ansible_search_path' from source: unknown 15247 1726867267.37286: calling self._execute() 15247 1726867267.37345: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.37348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.37358: variable 'omit' from source: magic vars 15247 1726867267.37666: variable 'ansible_distribution_major_version' from source: facts 15247 1726867267.37674: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867267.37680: variable 'omit' from source: magic vars 15247 1726867267.37708: variable 'omit' from source: magic vars 15247 1726867267.37736: variable 'omit' from source: magic vars 15247 1726867267.37765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867267.37795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867267.37810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867267.37830: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.37833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.37856: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867267.37859: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.37862: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867267.37938: Set connection var ansible_shell_executable to /bin/sh 15247 1726867267.37942: Set connection var ansible_connection to ssh 15247 1726867267.37944: Set connection var ansible_shell_type to sh 15247 1726867267.37946: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867267.37952: Set connection var ansible_timeout to 10 15247 1726867267.37956: Set connection var ansible_pipelining to False 15247 1726867267.37974: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.37979: variable 'ansible_connection' from source: unknown 15247 1726867267.37982: variable 'ansible_module_compression' from source: unknown 15247 1726867267.37985: variable 'ansible_shell_type' from source: unknown 15247 1726867267.37987: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.37990: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.37992: variable 'ansible_pipelining' from source: unknown 15247 1726867267.37996: variable 'ansible_timeout' from source: unknown 15247 1726867267.37998: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.38147: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867267.38159: variable 'omit' from source: magic vars 15247 1726867267.38162: starting attempt loop 15247 1726867267.38166: running the handler 15247 1726867267.38178: _low_level_execute_command(): starting 15247 1726867267.38185: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867267.38767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867267.38772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.38834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.38839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.38906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.40583: stdout chunk (state=3): >>>/root <<< 15247 1726867267.40709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.40724: stderr chunk (state=3): >>><<< 15247 1726867267.40734: stdout chunk (state=3): >>><<< 15247 1726867267.40770: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.40773: _low_level_execute_command(): starting 15247 1726867267.40784: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154 `" && echo ansible-tmp-1726867267.4076211-16966-192799777556154="` echo /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154 `" ) && sleep 0' 15247 1726867267.41371: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867267.41374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867267.41376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867267.41381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867267.41389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.41436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867267.41441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.41443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.41484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.43364: stdout chunk (state=3): >>>ansible-tmp-1726867267.4076211-16966-192799777556154=/root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154 <<< 15247 1726867267.43807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.43810: stdout chunk (state=3): >>><<< 15247 1726867267.43812: stderr chunk (state=3): >>><<< 15247 1726867267.43818: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867267.4076211-16966-192799777556154=/root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.43820: variable 'ansible_module_compression' from source: unknown 15247 1726867267.43821: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 15247 1726867267.43996: variable 'ansible_facts' from source: unknown 15247 1726867267.44184: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py 15247 1726867267.44369: Sending initial data 15247 1726867267.44382: Sent initial data (153 bytes) 15247 1726867267.44737: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867267.44750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.44760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.44812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.44818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.44875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.46510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867267.46548: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867267.46584: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp_6_hval1 /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py <<< 15247 1726867267.46597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py" <<< 15247 1726867267.46652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp_6_hval1" to remote "/root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py" <<< 15247 1726867267.48034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.48237: stderr chunk (state=3): >>><<< 15247 1726867267.48240: stdout chunk (state=3): >>><<< 15247 1726867267.48242: done transferring module to remote 15247 1726867267.48244: _low_level_execute_command(): starting 15247 1726867267.48246: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/ /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py && sleep 0' 15247 1726867267.49393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867267.49445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.49471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867267.49556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.49693: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.49875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.51537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.51558: stderr chunk (state=3): >>><<< 15247 1726867267.51591: stdout chunk (state=3): >>><<< 15247 1726867267.51619: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.51864: _low_level_execute_command(): starting 15247 1726867267.51867: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/AnsiballZ_ping.py && sleep 0' 15247 1726867267.52436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867267.52450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867267.52465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867267.52485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867267.52502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867267.52531: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.52545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867267.52644: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.52664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.52746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.67843: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 15247 1726867267.69371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867267.69374: stdout chunk (state=3): >>><<< 15247 1726867267.69379: stderr chunk (state=3): >>><<< 15247 1726867267.69381: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.12.116 closed. 15247 1726867267.69384: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867267.69386: _low_level_execute_command(): starting 15247 1726867267.69388: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867267.4076211-16966-192799777556154/ > /dev/null 2>&1 && sleep 0' 15247 1726867267.69964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867267.69981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867267.70023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867267.70040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867267.70135: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.70160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.70231: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.72127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.72130: stdout chunk (state=3): >>><<< 15247 1726867267.72132: stderr chunk (state=3): >>><<< 15247 1726867267.72282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.72290: 
handler run complete 15247 1726867267.72292: attempt loop complete, returning result 15247 1726867267.72294: _execute() done 15247 1726867267.72296: dumping result to json 15247 1726867267.72298: done dumping result, returning 15247 1726867267.72300: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-8ce3-1923-00000000006f] 15247 1726867267.72302: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006f 15247 1726867267.72363: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000006f 15247 1726867267.72365: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 15247 1726867267.72425: no more pending results, returning what we have 15247 1726867267.72429: results queue empty 15247 1726867267.72430: checking for any_errors_fatal 15247 1726867267.72438: done checking for any_errors_fatal 15247 1726867267.72438: checking for max_fail_percentage 15247 1726867267.72440: done checking for max_fail_percentage 15247 1726867267.72441: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.72442: done checking to see if all hosts have failed 15247 1726867267.72443: getting the remaining hosts for this loop 15247 1726867267.72444: done getting the remaining hosts for this loop 15247 1726867267.72448: getting the next task for host managed_node2 15247 1726867267.72456: done getting next task for host managed_node2 15247 1726867267.72458: ^ task is: TASK: meta (role_complete) 15247 1726867267.72460: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867267.72469: getting variables 15247 1726867267.72471: in VariableManager get_vars() 15247 1726867267.72508: Calling all_inventory to load vars for managed_node2 15247 1726867267.72510: Calling groups_inventory to load vars for managed_node2 15247 1726867267.72514: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.72525: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.72527: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.72529: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.74302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.76222: done with get_vars() 15247 1726867267.76244: done getting variables 15247 1726867267.76330: done queuing things up, now waiting for results queue to drain 15247 1726867267.76332: results queue empty 15247 1726867267.76333: checking for any_errors_fatal 15247 1726867267.76335: done checking for any_errors_fatal 15247 1726867267.76336: checking for max_fail_percentage 15247 1726867267.76337: done checking for max_fail_percentage 15247 1726867267.76338: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.76338: done checking to see if all hosts have failed 15247 1726867267.76339: getting the remaining hosts for this loop 15247 1726867267.76340: done getting the remaining hosts for this loop 15247 1726867267.76342: getting the next task for host managed_node2 15247 1726867267.76345: done getting next task for host managed_node2 15247 1726867267.76347: ^ task is: TASK: meta (flush_handlers) 15247 1726867267.76348: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15247 1726867267.76351: getting variables 15247 1726867267.76352: in VariableManager get_vars() 15247 1726867267.76363: Calling all_inventory to load vars for managed_node2 15247 1726867267.76365: Calling groups_inventory to load vars for managed_node2 15247 1726867267.76367: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.76371: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.76373: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.76376: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.77718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.79387: done with get_vars() 15247 1726867267.79407: done getting variables 15247 1726867267.79469: in VariableManager get_vars() 15247 1726867267.79482: Calling all_inventory to load vars for managed_node2 15247 1726867267.79485: Calling groups_inventory to load vars for managed_node2 15247 1726867267.79487: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.79492: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.79495: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.79498: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.80696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.82509: done with get_vars() 15247 1726867267.82543: done queuing things up, now waiting for results queue to drain 15247 1726867267.82545: results queue empty 15247 1726867267.82546: checking for any_errors_fatal 15247 1726867267.82547: done checking for any_errors_fatal 15247 1726867267.82548: checking for max_fail_percentage 15247 1726867267.82549: done checking for max_fail_percentage 15247 1726867267.82549: checking to see if all hosts have failed and 
the running result is not ok 15247 1726867267.82550: done checking to see if all hosts have failed 15247 1726867267.82551: getting the remaining hosts for this loop 15247 1726867267.82552: done getting the remaining hosts for this loop 15247 1726867267.82554: getting the next task for host managed_node2 15247 1726867267.82558: done getting next task for host managed_node2 15247 1726867267.82560: ^ task is: TASK: meta (flush_handlers) 15247 1726867267.82561: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867267.82564: getting variables 15247 1726867267.82565: in VariableManager get_vars() 15247 1726867267.82575: Calling all_inventory to load vars for managed_node2 15247 1726867267.82579: Calling groups_inventory to load vars for managed_node2 15247 1726867267.82581: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.82586: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.82588: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.82591: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.83788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.85407: done with get_vars() 15247 1726867267.85430: done getting variables 15247 1726867267.85490: in VariableManager get_vars() 15247 1726867267.85500: Calling all_inventory to load vars for managed_node2 15247 1726867267.85501: Calling groups_inventory to load vars for managed_node2 15247 1726867267.85503: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.85507: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.85509: Calling 
groups_plugins_inventory to load vars for managed_node2 15247 1726867267.85511: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.86821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.88555: done with get_vars() 15247 1726867267.88580: done queuing things up, now waiting for results queue to drain 15247 1726867267.88582: results queue empty 15247 1726867267.88583: checking for any_errors_fatal 15247 1726867267.88584: done checking for any_errors_fatal 15247 1726867267.88585: checking for max_fail_percentage 15247 1726867267.88586: done checking for max_fail_percentage 15247 1726867267.88587: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.88588: done checking to see if all hosts have failed 15247 1726867267.88588: getting the remaining hosts for this loop 15247 1726867267.88589: done getting the remaining hosts for this loop 15247 1726867267.88592: getting the next task for host managed_node2 15247 1726867267.88595: done getting next task for host managed_node2 15247 1726867267.88596: ^ task is: None 15247 1726867267.88597: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867267.88598: done queuing things up, now waiting for results queue to drain 15247 1726867267.88599: results queue empty 15247 1726867267.88600: checking for any_errors_fatal 15247 1726867267.88600: done checking for any_errors_fatal 15247 1726867267.88601: checking for max_fail_percentage 15247 1726867267.88602: done checking for max_fail_percentage 15247 1726867267.88603: checking to see if all hosts have failed and the running result is not ok 15247 1726867267.88603: done checking to see if all hosts have failed 15247 1726867267.88604: getting the next task for host managed_node2 15247 1726867267.88607: done getting next task for host managed_node2 15247 1726867267.88608: ^ task is: None 15247 1726867267.88609: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867267.88658: in VariableManager get_vars() 15247 1726867267.88673: done with get_vars() 15247 1726867267.88681: in VariableManager get_vars() 15247 1726867267.88691: done with get_vars() 15247 1726867267.88696: variable 'omit' from source: magic vars 15247 1726867267.88826: variable 'task' from source: play vars 15247 1726867267.88862: in VariableManager get_vars() 15247 1726867267.88872: done with get_vars() 15247 1726867267.88891: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 15247 1726867267.89157: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867267.89190: getting the remaining hosts for this loop 15247 1726867267.89192: done getting the remaining hosts for this loop 15247 1726867267.89194: getting the next task for host managed_node2 15247 1726867267.89197: done getting next task for host managed_node2 15247 1726867267.89199: ^ task is: TASK: Gathering Facts 15247 1726867267.89201: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867267.89203: getting variables 15247 1726867267.89203: in VariableManager get_vars() 15247 1726867267.89212: Calling all_inventory to load vars for managed_node2 15247 1726867267.89214: Calling groups_inventory to load vars for managed_node2 15247 1726867267.89216: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867267.89221: Calling all_plugins_play to load vars for managed_node2 15247 1726867267.89223: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867267.89226: Calling groups_plugins_play to load vars for managed_node2 15247 1726867267.90354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867267.92275: done with get_vars() 15247 1726867267.92292: done getting variables 15247 1726867267.92324: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 17:21:07 -0400 (0:00:00.556) 0:00:37.633 ****** 15247 1726867267.92342: entering _queue_task() for managed_node2/gather_facts 15247 1726867267.92583: worker is 1 (out of 1 available) 15247 1726867267.92597: exiting _queue_task() for managed_node2/gather_facts 15247 1726867267.92608: done queuing things up, now waiting for results queue to drain 15247 1726867267.92609: waiting for pending results... 
15247 1726867267.92781: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867267.92851: in run() - task 0affcac9-a3a5-8ce3-1923-00000000046e 15247 1726867267.92864: variable 'ansible_search_path' from source: unknown 15247 1726867267.92894: calling self._execute() 15247 1726867267.92962: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.92966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.92975: variable 'omit' from source: magic vars 15247 1726867267.93247: variable 'ansible_distribution_major_version' from source: facts 15247 1726867267.93256: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867267.93263: variable 'omit' from source: magic vars 15247 1726867267.93286: variable 'omit' from source: magic vars 15247 1726867267.93310: variable 'omit' from source: magic vars 15247 1726867267.93341: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867267.93367: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867267.93391: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867267.93399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.93409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867267.93433: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867267.93436: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.93438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.93512: Set connection var ansible_shell_executable to /bin/sh 15247 1726867267.93518: Set 
connection var ansible_connection to ssh 15247 1726867267.93521: Set connection var ansible_shell_type to sh 15247 1726867267.93523: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867267.93529: Set connection var ansible_timeout to 10 15247 1726867267.93534: Set connection var ansible_pipelining to False 15247 1726867267.93551: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.93554: variable 'ansible_connection' from source: unknown 15247 1726867267.93556: variable 'ansible_module_compression' from source: unknown 15247 1726867267.93559: variable 'ansible_shell_type' from source: unknown 15247 1726867267.93561: variable 'ansible_shell_executable' from source: unknown 15247 1726867267.93563: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867267.93568: variable 'ansible_pipelining' from source: unknown 15247 1726867267.93570: variable 'ansible_timeout' from source: unknown 15247 1726867267.93575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867267.93744: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867267.93932: variable 'omit' from source: magic vars 15247 1726867267.93935: starting attempt loop 15247 1726867267.93938: running the handler 15247 1726867267.93940: variable 'ansible_facts' from source: unknown 15247 1726867267.93942: _low_level_execute_command(): starting 15247 1726867267.93944: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867267.94503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867267.94575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867267.94600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.94689: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867267.94720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867267.94753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.94812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.96532: stdout chunk (state=3): >>>/root <<< 15247 1726867267.96664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.96667: stdout chunk (state=3): >>><<< 15247 1726867267.96670: stderr chunk (state=3): >>><<< 15247 1726867267.96805: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.96809: _low_level_execute_command(): starting 15247 1726867267.96812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397 `" && echo ansible-tmp-1726867267.967023-16992-21149408034397="` echo /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397 `" ) && sleep 0' 15247 1726867267.97383: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867267.97386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867267.97389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.97491: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867267.97495: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867267.97510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867267.97527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867267.99448: stdout chunk (state=3): >>>ansible-tmp-1726867267.967023-16992-21149408034397=/root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397 <<< 15247 1726867267.99655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867267.99659: stdout chunk (state=3): >>><<< 15247 1726867267.99662: stderr chunk (state=3): >>><<< 15247 1726867267.99683: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867267.967023-16992-21149408034397=/root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867267.99811: variable 'ansible_module_compression' from source: unknown 15247 1726867267.99814: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867267.99857: variable 'ansible_facts' from source: unknown 15247 1726867268.00084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py 15247 1726867268.00303: Sending initial data 15247 1726867268.00306: Sent initial data (152 bytes) 15247 1726867268.00869: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867268.00885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867268.00948: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867268.01005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867268.01008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867268.01048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867268.02603: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15247 1726867268.02609: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867268.02640: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867268.02678: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpyp7w55me /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py <<< 15247 1726867268.02685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py" <<< 15247 1726867268.02714: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpyp7w55me" to remote "/root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py" <<< 15247 1726867268.02721: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py" <<< 15247 1726867268.04085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867268.04089: stdout chunk (state=3): >>><<< 15247 1726867268.04091: stderr chunk (state=3): >>><<< 15247 1726867268.04093: done transferring module to remote 15247 1726867268.04095: _low_level_execute_command(): starting 15247 1726867268.04097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/ /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py && sleep 0' 15247 1726867268.04848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867268.04867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867268.04889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867268.04906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867268.04923: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 <<< 15247 1726867268.04944: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867268.04973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867268.05047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867268.05090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867268.05118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867268.05147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867268.06965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867268.06970: stdout chunk (state=3): >>><<< 15247 1726867268.06973: stderr chunk (state=3): >>><<< 15247 1726867268.06995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867268.07087: _low_level_execute_command(): starting 15247 1726867268.07092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/AnsiballZ_setup.py && sleep 0' 15247 1726867268.07590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867268.07611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867268.07630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867268.07641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867268.07696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867268.07721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867268.07764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867268.70155: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "08", "epoch": "1726867268", "epoch_int": "1726867268", "date": "2024-09-20", "time": "17:21:08", "iso8601_micro": "2024-09-20T21:21:08.341560Z", "iso8601": "2024-09-20T21:21:08Z", "iso8601_basic": "20240920T172108341560", "iso8601_basic_short": "20240920T172108", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.61865234375, "5m": 0.40576171875, "15m": 0.20068359375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": 
"ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "Ge<<< 15247 1726867268.70172: stdout chunk (state=3): >>>nuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 506, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796978688, "block_size": 4096, "block_total": 65519099, "block_available": 63915278, "block_used": 1603821, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": 
{"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_of<<< 15247 1726867268.70187: stdout chunk (state=3): >>>fload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": 
"off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off 
[fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gr<<< 15247 1726867268.70203: stdout chunk (state=3): >>>o_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", 
"SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867268.72134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867268.72156: stderr chunk (state=3): >>><<< 15247 1726867268.72159: stdout chunk (state=3): >>><<< 15247 1726867268.72192: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "08", "epoch": "1726867268", "epoch_int": "1726867268", "date": "2024-09-20", "time": "17:21:08", "iso8601_micro": "2024-09-20T21:21:08.341560Z", "iso8601": "2024-09-20T21:21:08Z", "iso8601_basic": "20240920T172108341560", "iso8601_basic_short": "20240920T172108", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.61865234375, "5m": 0.40576171875, "15m": 0.20068359375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_selinux_python_present": true, 
"ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", 
"root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2961, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 570, "free": 2961}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 506, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261796978688, "block_size": 4096, "block_total": 65519099, "block_available": 63915278, "block_used": 1603821, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off 
[fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_iscsi_iqn": "", "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867268.72421: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867268.72439: _low_level_execute_command(): starting 15247 1726867268.72444: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867267.967023-16992-21149408034397/ > /dev/null 2>&1 && sleep 0' 15247 1726867268.72886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867268.72889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867268.72892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867268.72894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867268.72897: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867268.72952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867268.72958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867268.72960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867268.72999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867268.74806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867268.74834: stderr chunk (state=3): >>><<< 15247 1726867268.74838: stdout chunk (state=3): >>><<< 15247 1726867268.74851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867268.74859: handler run complete 15247 1726867268.74932: variable 'ansible_facts' from source: unknown 15247 1726867268.75105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.75282: variable 'ansible_facts' from source: unknown 15247 1726867268.75335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.75411: attempt loop complete, returning result 15247 1726867268.75417: _execute() done 15247 1726867268.75419: dumping result to json 15247 1726867268.75437: done dumping result, returning 15247 1726867268.75443: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-00000000046e] 15247 1726867268.75448: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000046e ok: [managed_node2] 15247 1726867268.76004: no more pending results, returning what we have 15247 1726867268.76006: results queue empty 15247 1726867268.76007: checking for any_errors_fatal 15247 1726867268.76008: done checking for any_errors_fatal 15247 1726867268.76008: checking for max_fail_percentage 15247 1726867268.76009: done checking for max_fail_percentage 15247 1726867268.76010: checking to see if all hosts have failed and the running result is not ok 15247 1726867268.76010: done checking to see if all hosts have failed 15247 1726867268.76011: getting the remaining hosts for this loop 15247 1726867268.76012: done getting the remaining hosts for this loop 15247 1726867268.76014: getting the next task for host managed_node2 15247 1726867268.76018: done getting next task for host managed_node2 15247 1726867268.76019: ^ task is: TASK: meta (flush_handlers) 15247 1726867268.76021: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867268.76023: getting variables 15247 1726867268.76024: in VariableManager get_vars() 15247 1726867268.76040: Calling all_inventory to load vars for managed_node2 15247 1726867268.76041: Calling groups_inventory to load vars for managed_node2 15247 1726867268.76043: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.76052: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.76054: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.76056: Calling groups_plugins_play to load vars for managed_node2 15247 1726867268.76573: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000046e 15247 1726867268.76578: WORKER PROCESS EXITING 15247 1726867268.76789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.77639: done with get_vars() 15247 1726867268.77654: done getting variables 15247 1726867268.77703: in VariableManager get_vars() 15247 1726867268.77711: Calling all_inventory to load vars for managed_node2 15247 1726867268.77713: Calling groups_inventory to load vars for managed_node2 15247 1726867268.77715: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.77718: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.77720: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.77721: Calling groups_plugins_play to load vars for managed_node2 15247 1726867268.78415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.79268: done with get_vars() 15247 1726867268.79285: done queuing things up, now waiting for results queue to drain 15247 
1726867268.79287: results queue empty 15247 1726867268.79287: checking for any_errors_fatal 15247 1726867268.79289: done checking for any_errors_fatal 15247 1726867268.79293: checking for max_fail_percentage 15247 1726867268.79294: done checking for max_fail_percentage 15247 1726867268.79294: checking to see if all hosts have failed and the running result is not ok 15247 1726867268.79295: done checking to see if all hosts have failed 15247 1726867268.79295: getting the remaining hosts for this loop 15247 1726867268.79296: done getting the remaining hosts for this loop 15247 1726867268.79297: getting the next task for host managed_node2 15247 1726867268.79300: done getting next task for host managed_node2 15247 1726867268.79301: ^ task is: TASK: Include the task '{{ task }}' 15247 1726867268.79302: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867268.79304: getting variables 15247 1726867268.79304: in VariableManager get_vars() 15247 1726867268.79309: Calling all_inventory to load vars for managed_node2 15247 1726867268.79310: Calling groups_inventory to load vars for managed_node2 15247 1726867268.79312: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.79315: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.79317: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.79318: Calling groups_plugins_play to load vars for managed_node2 15247 1726867268.79935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.80768: done with get_vars() 15247 1726867268.80783: done getting variables 15247 1726867268.80896: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 17:21:08 -0400 (0:00:00.885) 0:00:38.518 ****** 15247 1726867268.80918: entering _queue_task() for managed_node2/include_tasks 15247 1726867268.81140: worker is 1 (out of 1 available) 15247 1726867268.81152: exiting _queue_task() for managed_node2/include_tasks 15247 1726867268.81163: done queuing things up, now waiting for results queue to drain 15247 1726867268.81165: waiting for pending results... 
15247 1726867268.81337: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_absent.yml' 15247 1726867268.81408: in run() - task 0affcac9-a3a5-8ce3-1923-000000000073 15247 1726867268.81422: variable 'ansible_search_path' from source: unknown 15247 1726867268.81448: calling self._execute() 15247 1726867268.81516: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.81523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.81538: variable 'omit' from source: magic vars 15247 1726867268.81794: variable 'ansible_distribution_major_version' from source: facts 15247 1726867268.81803: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867268.81809: variable 'task' from source: play vars 15247 1726867268.81861: variable 'task' from source: play vars 15247 1726867268.81867: _execute() done 15247 1726867268.81870: dumping result to json 15247 1726867268.81873: done dumping result, returning 15247 1726867268.81881: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_profile_absent.yml' [0affcac9-a3a5-8ce3-1923-000000000073] 15247 1726867268.81887: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000073 15247 1726867268.81982: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000073 15247 1726867268.81984: WORKER PROCESS EXITING 15247 1726867268.82008: no more pending results, returning what we have 15247 1726867268.82015: in VariableManager get_vars() 15247 1726867268.82046: Calling all_inventory to load vars for managed_node2 15247 1726867268.82049: Calling groups_inventory to load vars for managed_node2 15247 1726867268.82052: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.82063: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.82065: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.82068: Calling 
groups_plugins_play to load vars for managed_node2 15247 1726867268.82864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.83727: done with get_vars() 15247 1726867268.83739: variable 'ansible_search_path' from source: unknown 15247 1726867268.83748: we have included files to process 15247 1726867268.83749: generating all_blocks data 15247 1726867268.83750: done generating all_blocks data 15247 1726867268.83750: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15247 1726867268.83751: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15247 1726867268.83752: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 15247 1726867268.83858: in VariableManager get_vars() 15247 1726867268.83868: done with get_vars() 15247 1726867268.83941: done processing included file 15247 1726867268.83942: iterating over new_blocks loaded from include file 15247 1726867268.83943: in VariableManager get_vars() 15247 1726867268.83950: done with get_vars() 15247 1726867268.83951: filtering new block on tags 15247 1726867268.83961: done filtering new block on tags 15247 1726867268.83962: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 15247 1726867268.83965: extending task lists for all hosts with included blocks 15247 1726867268.83985: done extending task lists 15247 1726867268.83985: done processing included files 15247 1726867268.83986: results queue empty 15247 1726867268.83987: checking for any_errors_fatal 15247 1726867268.83988: done checking for any_errors_fatal 15247 
1726867268.83988: checking for max_fail_percentage 15247 1726867268.83989: done checking for max_fail_percentage 15247 1726867268.83990: checking to see if all hosts have failed and the running result is not ok 15247 1726867268.83991: done checking to see if all hosts have failed 15247 1726867268.83991: getting the remaining hosts for this loop 15247 1726867268.83992: done getting the remaining hosts for this loop 15247 1726867268.83994: getting the next task for host managed_node2 15247 1726867268.83996: done getting next task for host managed_node2 15247 1726867268.83998: ^ task is: TASK: Include the task 'get_profile_stat.yml' 15247 1726867268.83999: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867268.84001: getting variables 15247 1726867268.84002: in VariableManager get_vars() 15247 1726867268.84007: Calling all_inventory to load vars for managed_node2 15247 1726867268.84008: Calling groups_inventory to load vars for managed_node2 15247 1726867268.84009: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.84012: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.84016: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.84018: Calling groups_plugins_play to load vars for managed_node2 15247 1726867268.84640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.85512: done with get_vars() 15247 1726867268.85528: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 17:21:08 -0400 (0:00:00.046) 0:00:38.565 ****** 15247 1726867268.85608: entering _queue_task() for managed_node2/include_tasks 15247 1726867268.85835: worker is 1 (out of 1 available) 15247 1726867268.85847: exiting _queue_task() for managed_node2/include_tasks 15247 1726867268.85860: done queuing things up, now waiting for results queue to drain 15247 1726867268.85861: waiting for pending results... 
15247 1726867268.86095: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 15247 1726867268.86151: in run() - task 0affcac9-a3a5-8ce3-1923-00000000047f 15247 1726867268.86161: variable 'ansible_search_path' from source: unknown 15247 1726867268.86164: variable 'ansible_search_path' from source: unknown 15247 1726867268.86199: calling self._execute() 15247 1726867268.86267: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.86272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.86283: variable 'omit' from source: magic vars 15247 1726867268.86566: variable 'ansible_distribution_major_version' from source: facts 15247 1726867268.86570: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867268.86573: _execute() done 15247 1726867268.86581: dumping result to json 15247 1726867268.86586: done dumping result, returning 15247 1726867268.86589: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-8ce3-1923-00000000047f] 15247 1726867268.86595: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000047f 15247 1726867268.86675: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000047f 15247 1726867268.86680: WORKER PROCESS EXITING 15247 1726867268.86706: no more pending results, returning what we have 15247 1726867268.86711: in VariableManager get_vars() 15247 1726867268.86744: Calling all_inventory to load vars for managed_node2 15247 1726867268.86747: Calling groups_inventory to load vars for managed_node2 15247 1726867268.86750: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.86760: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.86762: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.86765: Calling groups_plugins_play to load vars for managed_node2 15247 
1726867268.87545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.88395: done with get_vars() 15247 1726867268.88407: variable 'ansible_search_path' from source: unknown 15247 1726867268.88408: variable 'ansible_search_path' from source: unknown 15247 1726867268.88415: variable 'task' from source: play vars 15247 1726867268.88483: variable 'task' from source: play vars 15247 1726867268.88507: we have included files to process 15247 1726867268.88508: generating all_blocks data 15247 1726867268.88509: done generating all_blocks data 15247 1726867268.88509: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15247 1726867268.88510: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15247 1726867268.88512: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 15247 1726867268.89081: done processing included file 15247 1726867268.89083: iterating over new_blocks loaded from include file 15247 1726867268.89084: in VariableManager get_vars() 15247 1726867268.89091: done with get_vars() 15247 1726867268.89092: filtering new block on tags 15247 1726867268.89105: done filtering new block on tags 15247 1726867268.89107: in VariableManager get_vars() 15247 1726867268.89115: done with get_vars() 15247 1726867268.89116: filtering new block on tags 15247 1726867268.89127: done filtering new block on tags 15247 1726867268.89129: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 15247 1726867268.89132: extending task lists for all hosts with included blocks 15247 1726867268.89191: done extending 
task lists 15247 1726867268.89192: done processing included files 15247 1726867268.89193: results queue empty 15247 1726867268.89193: checking for any_errors_fatal 15247 1726867268.89196: done checking for any_errors_fatal 15247 1726867268.89196: checking for max_fail_percentage 15247 1726867268.89197: done checking for max_fail_percentage 15247 1726867268.89197: checking to see if all hosts have failed and the running result is not ok 15247 1726867268.89198: done checking to see if all hosts have failed 15247 1726867268.89198: getting the remaining hosts for this loop 15247 1726867268.89199: done getting the remaining hosts for this loop 15247 1726867268.89201: getting the next task for host managed_node2 15247 1726867268.89203: done getting next task for host managed_node2 15247 1726867268.89204: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 15247 1726867268.89206: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867268.89207: getting variables 15247 1726867268.89208: in VariableManager get_vars() 15247 1726867268.89215: Calling all_inventory to load vars for managed_node2 15247 1726867268.89216: Calling groups_inventory to load vars for managed_node2 15247 1726867268.89218: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.89221: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.89222: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.89224: Calling groups_plugins_play to load vars for managed_node2 15247 1726867268.89889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.93584: done with get_vars() 15247 1726867268.93598: done getting variables 15247 1726867268.93631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:21:08 -0400 (0:00:00.080) 0:00:38.646 ****** 15247 1726867268.93648: entering _queue_task() for managed_node2/set_fact 15247 1726867268.93882: worker is 1 (out of 1 available) 15247 1726867268.93893: exiting _queue_task() for managed_node2/set_fact 15247 1726867268.93905: done queuing things up, now waiting for results queue to drain 15247 1726867268.93906: waiting for pending results... 
15247 1726867268.94074: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 15247 1726867268.94170: in run() - task 0affcac9-a3a5-8ce3-1923-00000000048a 15247 1726867268.94242: variable 'ansible_search_path' from source: unknown 15247 1726867268.94246: variable 'ansible_search_path' from source: unknown 15247 1726867268.94249: calling self._execute() 15247 1726867268.94286: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.94293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.94303: variable 'omit' from source: magic vars 15247 1726867268.94568: variable 'ansible_distribution_major_version' from source: facts 15247 1726867268.94579: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867268.94585: variable 'omit' from source: magic vars 15247 1726867268.94625: variable 'omit' from source: magic vars 15247 1726867268.94648: variable 'omit' from source: magic vars 15247 1726867268.94681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867268.94711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867268.94729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867268.94742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867268.94751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867268.94773: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867268.94778: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.94782: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 15247 1726867268.94854: Set connection var ansible_shell_executable to /bin/sh 15247 1726867268.94857: Set connection var ansible_connection to ssh 15247 1726867268.94860: Set connection var ansible_shell_type to sh 15247 1726867268.94863: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867268.94870: Set connection var ansible_timeout to 10 15247 1726867268.94874: Set connection var ansible_pipelining to False 15247 1726867268.94894: variable 'ansible_shell_executable' from source: unknown 15247 1726867268.94897: variable 'ansible_connection' from source: unknown 15247 1726867268.94900: variable 'ansible_module_compression' from source: unknown 15247 1726867268.94902: variable 'ansible_shell_type' from source: unknown 15247 1726867268.94906: variable 'ansible_shell_executable' from source: unknown 15247 1726867268.94909: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.94912: variable 'ansible_pipelining' from source: unknown 15247 1726867268.94915: variable 'ansible_timeout' from source: unknown 15247 1726867268.94917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.95013: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867268.95032: variable 'omit' from source: magic vars 15247 1726867268.95038: starting attempt loop 15247 1726867268.95042: running the handler 15247 1726867268.95044: handler run complete 15247 1726867268.95052: attempt loop complete, returning result 15247 1726867268.95055: _execute() done 15247 1726867268.95058: dumping result to json 15247 1726867268.95060: done dumping result, returning 15247 1726867268.95066: done running TaskExecutor() for 
managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-8ce3-1923-00000000048a] 15247 1726867268.95072: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048a ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 15247 1726867268.95207: no more pending results, returning what we have 15247 1726867268.95210: results queue empty 15247 1726867268.95211: checking for any_errors_fatal 15247 1726867268.95213: done checking for any_errors_fatal 15247 1726867268.95214: checking for max_fail_percentage 15247 1726867268.95216: done checking for max_fail_percentage 15247 1726867268.95216: checking to see if all hosts have failed and the running result is not ok 15247 1726867268.95217: done checking to see if all hosts have failed 15247 1726867268.95218: getting the remaining hosts for this loop 15247 1726867268.95219: done getting the remaining hosts for this loop 15247 1726867268.95223: getting the next task for host managed_node2 15247 1726867268.95230: done getting next task for host managed_node2 15247 1726867268.95233: ^ task is: TASK: Stat profile file 15247 1726867268.95236: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867268.95241: getting variables 15247 1726867268.95245: in VariableManager get_vars() 15247 1726867268.95273: Calling all_inventory to load vars for managed_node2 15247 1726867268.95276: Calling groups_inventory to load vars for managed_node2 15247 1726867268.95280: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867268.95290: Calling all_plugins_play to load vars for managed_node2 15247 1726867268.95292: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867268.95295: Calling groups_plugins_play to load vars for managed_node2 15247 1726867268.96043: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048a 15247 1726867268.96047: WORKER PROCESS EXITING 15247 1726867268.96057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867268.97195: done with get_vars() 15247 1726867268.97244: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:21:08 -0400 (0:00:00.037) 0:00:38.683 ****** 15247 1726867268.97372: entering _queue_task() for managed_node2/stat 15247 1726867268.97762: worker is 1 (out of 1 available) 15247 1726867268.97775: exiting _queue_task() for managed_node2/stat 15247 1726867268.97794: done queuing things up, now waiting for results queue to drain 15247 1726867268.97796: waiting for pending results... 
15247 1726867268.98042: running TaskExecutor() for managed_node2/TASK: Stat profile file 15247 1726867268.98211: in run() - task 0affcac9-a3a5-8ce3-1923-00000000048b 15247 1726867268.98220: variable 'ansible_search_path' from source: unknown 15247 1726867268.98223: variable 'ansible_search_path' from source: unknown 15247 1726867268.98256: calling self._execute() 15247 1726867268.98321: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.98328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.98339: variable 'omit' from source: magic vars 15247 1726867268.98789: variable 'ansible_distribution_major_version' from source: facts 15247 1726867268.98792: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867268.98796: variable 'omit' from source: magic vars 15247 1726867268.98799: variable 'omit' from source: magic vars 15247 1726867268.98853: variable 'profile' from source: play vars 15247 1726867268.98863: variable 'interface' from source: set_fact 15247 1726867268.98933: variable 'interface' from source: set_fact 15247 1726867268.98954: variable 'omit' from source: magic vars 15247 1726867268.98999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867268.99041: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867268.99063: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867268.99086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867268.99099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867268.99136: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 
1726867268.99143: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.99149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.99256: Set connection var ansible_shell_executable to /bin/sh 15247 1726867268.99265: Set connection var ansible_connection to ssh 15247 1726867268.99333: Set connection var ansible_shell_type to sh 15247 1726867268.99336: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867268.99338: Set connection var ansible_timeout to 10 15247 1726867268.99341: Set connection var ansible_pipelining to False 15247 1726867268.99342: variable 'ansible_shell_executable' from source: unknown 15247 1726867268.99344: variable 'ansible_connection' from source: unknown 15247 1726867268.99346: variable 'ansible_module_compression' from source: unknown 15247 1726867268.99348: variable 'ansible_shell_type' from source: unknown 15247 1726867268.99350: variable 'ansible_shell_executable' from source: unknown 15247 1726867268.99352: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867268.99353: variable 'ansible_pipelining' from source: unknown 15247 1726867268.99361: variable 'ansible_timeout' from source: unknown 15247 1726867268.99368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867268.99563: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867268.99580: variable 'omit' from source: magic vars 15247 1726867268.99590: starting attempt loop 15247 1726867268.99596: running the handler 15247 1726867268.99611: _low_level_execute_command(): starting 15247 1726867268.99660: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867269.00394: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.00459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.00491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.00583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.02237: stdout chunk (state=3): >>>/root <<< 15247 1726867269.02338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.02364: stderr chunk (state=3): >>><<< 15247 1726867269.02368: stdout chunk (state=3): >>><<< 15247 1726867269.02389: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.02400: _low_level_execute_command(): starting 15247 1726867269.02406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210 `" && echo ansible-tmp-1726867269.0238833-17029-55650638684210="` echo /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210 `" ) && sleep 0' 15247 1726867269.02993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867269.02996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867269.03002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867269.03023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867269.03036: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867269.03047: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867269.03050: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.03065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867269.03073: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867269.03081: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15247 1726867269.03097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867269.03182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867269.03186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867269.03189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867269.03191: stderr chunk (state=3): >>>debug2: match found <<< 15247 1726867269.03194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.03209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.03248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.03287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.05166: stdout chunk (state=3): >>>ansible-tmp-1726867269.0238833-17029-55650638684210=/root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210 <<< 15247 1726867269.05306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.05317: stdout chunk (state=3): >>><<< 15247 1726867269.05329: stderr chunk (state=3): >>><<< 15247 1726867269.05352: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867269.0238833-17029-55650638684210=/root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.05409: variable 'ansible_module_compression' from source: unknown 15247 1726867269.05484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15247 1726867269.05528: variable 'ansible_facts' from source: unknown 15247 1726867269.05687: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py 15247 1726867269.05860: Sending initial data 15247 1726867269.05863: Sent initial data (152 bytes) 15247 1726867269.06456: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.06472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.06576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.08086: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 15247 1726867269.08101: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867269.08127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867269.08166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp6ttr9080 /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py <<< 15247 1726867269.08169: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py" <<< 15247 1726867269.08203: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp6ttr9080" to remote "/root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py" <<< 15247 1726867269.08705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.08750: stderr chunk (state=3): >>><<< 15247 1726867269.08754: stdout chunk (state=3): >>><<< 15247 1726867269.08756: done transferring module to remote 15247 1726867269.08765: _low_level_execute_command(): starting 15247 1726867269.08768: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/ /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py && sleep 0' 15247 1726867269.09365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.09452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867269.09456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.09458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.09504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.11264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.11285: stderr chunk (state=3): >>><<< 15247 1726867269.11288: stdout chunk (state=3): >>><<< 15247 1726867269.11301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.11304: _low_level_execute_command(): starting 15247 1726867269.11311: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/AnsiballZ_stat.py && sleep 0' 15247 1726867269.11857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.11862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867269.11864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.11867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.11918: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.27264: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15247 1726867269.28651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867269.28655: stdout chunk (state=3): >>><<< 15247 1726867269.28657: stderr chunk (state=3): >>><<< 15247 1726867269.28676: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867269.28797: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867269.28801: _low_level_execute_command(): starting 15247 1726867269.28803: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867269.0238833-17029-55650638684210/ > /dev/null 2>&1 && sleep 0' 15247 1726867269.29358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867269.29372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867269.29394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867269.29431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.29452: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass <<< 15247 1726867269.29543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.29558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867269.29580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.29602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.29671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.31512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.31537: stderr chunk (state=3): >>><<< 15247 1726867269.31541: stdout chunk (state=3): >>><<< 15247 1726867269.31554: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.31559: handler run complete 15247 1726867269.31574: attempt loop complete, returning result 15247 1726867269.31578: _execute() done 15247 1726867269.31581: dumping result to json 15247 1726867269.31583: done dumping result, returning 15247 1726867269.31593: done running TaskExecutor() for managed_node2/TASK: Stat profile file [0affcac9-a3a5-8ce3-1923-00000000048b] 15247 1726867269.31598: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048b 15247 1726867269.31687: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048b 15247 1726867269.31689: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15247 1726867269.31744: no more pending results, returning what we have 15247 1726867269.31747: results queue empty 15247 1726867269.31748: checking for any_errors_fatal 15247 1726867269.31761: done checking for any_errors_fatal 15247 1726867269.31762: checking for max_fail_percentage 15247 1726867269.31764: done checking for max_fail_percentage 15247 1726867269.31764: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.31765: done checking to see if all hosts have failed 15247 1726867269.31766: getting the remaining hosts for this loop 15247 1726867269.31767: done getting the remaining hosts for this loop 15247 1726867269.31771: getting the next task for host managed_node2 15247 1726867269.31779: done getting next task for host managed_node2 15247 1726867269.31783: ^ task is: TASK: Set NM profile exist flag based on the 
profile files 15247 1726867269.31786: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867269.31789: getting variables 15247 1726867269.31791: in VariableManager get_vars() 15247 1726867269.31822: Calling all_inventory to load vars for managed_node2 15247 1726867269.31824: Calling groups_inventory to load vars for managed_node2 15247 1726867269.31828: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.31839: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.31842: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.31844: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.33064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.34516: done with get_vars() 15247 1726867269.34537: done getting variables 15247 1726867269.34595: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:21:09 -0400 (0:00:00.372) 0:00:39.055 ****** 15247 1726867269.34627: entering _queue_task() for managed_node2/set_fact 15247 1726867269.34910: worker is 1 (out of 1 available) 15247 1726867269.34925: exiting _queue_task() for managed_node2/set_fact 15247 1726867269.34938: done queuing things up, now waiting for results queue to drain 15247 1726867269.34940: waiting for pending results... 15247 1726867269.35190: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 15247 1726867269.35284: in run() - task 0affcac9-a3a5-8ce3-1923-00000000048c 15247 1726867269.35296: variable 'ansible_search_path' from source: unknown 15247 1726867269.35299: variable 'ansible_search_path' from source: unknown 15247 1726867269.35331: calling self._execute() 15247 1726867269.35396: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.35401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.35412: variable 'omit' from source: magic vars 15247 1726867269.35689: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.35697: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.35783: variable 'profile_stat' from source: set_fact 15247 1726867269.35793: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867269.35796: when evaluation is False, skipping this task 15247 1726867269.35799: _execute() done 15247 1726867269.35801: dumping result to json 15247 1726867269.35803: done dumping 
result, returning 15247 1726867269.35812: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-8ce3-1923-00000000048c] 15247 1726867269.35818: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048c 15247 1726867269.35897: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048c 15247 1726867269.35900: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867269.35969: no more pending results, returning what we have 15247 1726867269.35972: results queue empty 15247 1726867269.35973: checking for any_errors_fatal 15247 1726867269.35980: done checking for any_errors_fatal 15247 1726867269.35981: checking for max_fail_percentage 15247 1726867269.35983: done checking for max_fail_percentage 15247 1726867269.35983: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.35984: done checking to see if all hosts have failed 15247 1726867269.35985: getting the remaining hosts for this loop 15247 1726867269.35986: done getting the remaining hosts for this loop 15247 1726867269.35989: getting the next task for host managed_node2 15247 1726867269.35994: done getting next task for host managed_node2 15247 1726867269.35996: ^ task is: TASK: Get NM profile info 15247 1726867269.35999: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867269.36002: getting variables 15247 1726867269.36003: in VariableManager get_vars() 15247 1726867269.36030: Calling all_inventory to load vars for managed_node2 15247 1726867269.36032: Calling groups_inventory to load vars for managed_node2 15247 1726867269.36035: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.36044: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.36046: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.36049: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.36842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.38128: done with get_vars() 15247 1726867269.38141: done getting variables 15247 1726867269.38183: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:21:09 -0400 (0:00:00.035) 0:00:39.091 ****** 15247 1726867269.38203: entering _queue_task() for managed_node2/shell 15247 1726867269.38392: worker is 1 (out of 1 available) 15247 1726867269.38405: exiting _queue_task() for managed_node2/shell 15247 1726867269.38419: done queuing things up, now waiting for 
results queue to drain 15247 1726867269.38420: waiting for pending results... 15247 1726867269.38574: running TaskExecutor() for managed_node2/TASK: Get NM profile info 15247 1726867269.38647: in run() - task 0affcac9-a3a5-8ce3-1923-00000000048d 15247 1726867269.38657: variable 'ansible_search_path' from source: unknown 15247 1726867269.38661: variable 'ansible_search_path' from source: unknown 15247 1726867269.38687: calling self._execute() 15247 1726867269.38751: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.38756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.38763: variable 'omit' from source: magic vars 15247 1726867269.39021: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.39030: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.39041: variable 'omit' from source: magic vars 15247 1726867269.39068: variable 'omit' from source: magic vars 15247 1726867269.39137: variable 'profile' from source: play vars 15247 1726867269.39142: variable 'interface' from source: set_fact 15247 1726867269.39188: variable 'interface' from source: set_fact 15247 1726867269.39202: variable 'omit' from source: magic vars 15247 1726867269.39233: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867269.39262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867269.39275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867269.39290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867269.39300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867269.39325: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867269.39328: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.39330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.39400: Set connection var ansible_shell_executable to /bin/sh 15247 1726867269.39403: Set connection var ansible_connection to ssh 15247 1726867269.39406: Set connection var ansible_shell_type to sh 15247 1726867269.39410: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867269.39418: Set connection var ansible_timeout to 10 15247 1726867269.39423: Set connection var ansible_pipelining to False 15247 1726867269.39440: variable 'ansible_shell_executable' from source: unknown 15247 1726867269.39443: variable 'ansible_connection' from source: unknown 15247 1726867269.39445: variable 'ansible_module_compression' from source: unknown 15247 1726867269.39447: variable 'ansible_shell_type' from source: unknown 15247 1726867269.39450: variable 'ansible_shell_executable' from source: unknown 15247 1726867269.39452: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.39457: variable 'ansible_pipelining' from source: unknown 15247 1726867269.39460: variable 'ansible_timeout' from source: unknown 15247 1726867269.39462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.39558: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867269.39567: variable 'omit' from source: magic vars 15247 1726867269.39572: starting attempt loop 15247 1726867269.39574: running the handler 15247 1726867269.39587: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867269.39604: _low_level_execute_command(): starting 15247 1726867269.39610: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867269.40247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.40305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.40343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.40413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.42102: stdout chunk (state=3): >>>/root <<< 15247 1726867269.42271: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.42275: stdout chunk (state=3): >>><<< 15247 
1726867269.42279: stderr chunk (state=3): >>><<< 15247 1726867269.42418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.42423: _low_level_execute_command(): starting 15247 1726867269.42426: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753 `" && echo ansible-tmp-1726867269.4231222-17046-96165845330753="` echo /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753 `" ) && sleep 0' 15247 1726867269.43075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867269.43103: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.43141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.43185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.45138: stdout chunk (state=3): >>>ansible-tmp-1726867269.4231222-17046-96165845330753=/root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753 <<< 15247 1726867269.45231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.45254: stderr chunk (state=3): >>><<< 15247 1726867269.45258: stdout chunk (state=3): >>><<< 15247 1726867269.45272: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867269.4231222-17046-96165845330753=/root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.45300: variable 'ansible_module_compression' from source: unknown 15247 1726867269.45370: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15247 1726867269.45433: variable 'ansible_facts' from source: unknown 15247 1726867269.45490: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py 15247 1726867269.45582: Sending initial data 15247 1726867269.45585: Sent initial data (155 bytes) 15247 1726867269.46002: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867269.46005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867269.46008: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867269.46010: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867269.46013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.46060: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867269.46065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.46108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.47686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 15247 1726867269.47690: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867269.47722: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 15247 1726867269.47758: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpjwxetaoq /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py <<< 15247 1726867269.47764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py" <<< 15247 1726867269.47804: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpjwxetaoq" to remote "/root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py" <<< 15247 1726867269.47806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py" <<< 15247 1726867269.48550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.48599: stderr chunk (state=3): >>><<< 15247 1726867269.48649: stdout chunk (state=3): >>><<< 15247 1726867269.48674: done transferring module to remote 15247 1726867269.48680: _low_level_execute_command(): starting 15247 1726867269.48683: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/ /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py && sleep 0' 15247 1726867269.49257: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867269.49261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867269.49264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867269.49266: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867269.49268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.49427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.49442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.51165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.51190: stderr chunk (state=3): >>><<< 15247 1726867269.51193: stdout chunk (state=3): >>><<< 15247 1726867269.51209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867269.51212: _low_level_execute_command(): starting 15247 1726867269.51217: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/AnsiballZ_command.py && sleep 0' 15247 1726867269.51633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867269.51637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.51639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867269.51641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867269.51695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' 
<<< 15247 1726867269.51700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.51741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.68463: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 17:21:09.667361", "end": "2024-09-20 17:21:09.683191", "delta": "0:00:00.015830", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15247 1726867269.70026: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867269.70029: stdout chunk (state=3): >>><<< 15247 1726867269.70032: stderr chunk (state=3): >>><<< 15247 1726867269.70061: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-20 17:21:09.667361", "end": "2024-09-20 17:21:09.683191", "delta": "0:00:00.015830", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 
10.31.12.116 closed. 15247 1726867269.70154: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867269.70186: _low_level_execute_command(): starting 15247 1726867269.70197: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867269.4231222-17046-96165845330753/ > /dev/null 2>&1 && sleep 0' 15247 1726867269.71094: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867269.71118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867269.71232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867269.71254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867269.71324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867269.73199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867269.73583: stderr chunk (state=3): >>><<< 15247 1726867269.73586: stdout chunk (state=3): >>><<< 15247 1726867269.73589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15247 1726867269.73591: handler run complete 15247 1726867269.73594: Evaluated conditional (False): False 15247 1726867269.73596: attempt loop complete, returning result 15247 1726867269.73598: _execute() done 15247 1726867269.73599: dumping result to json 15247 1726867269.73601: done dumping result, returning 15247 1726867269.73602: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [0affcac9-a3a5-8ce3-1923-00000000048d] 15247 1726867269.73604: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048d 15247 1726867269.73664: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048d 15247 1726867269.73666: WORKER PROCESS EXITING fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.015830", "end": "2024-09-20 17:21:09.683191", "rc": 1, "start": "2024-09-20 17:21:09.667361" } MSG: non-zero return code ...ignoring 15247 1726867269.73842: no more pending results, returning what we have 15247 1726867269.73845: results queue empty 15247 1726867269.73846: checking for any_errors_fatal 15247 1726867269.73852: done checking for any_errors_fatal 15247 1726867269.73853: checking for max_fail_percentage 15247 1726867269.73854: done checking for max_fail_percentage 15247 1726867269.73855: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.73856: done checking to see if all hosts have failed 15247 1726867269.73856: getting the remaining hosts for this loop 15247 1726867269.73858: done getting the remaining hosts for this loop 15247 1726867269.73861: getting the next task for host managed_node2 15247 1726867269.73867: done getting next task for host managed_node2 15247 1726867269.73869: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15247 1726867269.73872: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867269.73876: getting variables 15247 1726867269.73885: in VariableManager get_vars() 15247 1726867269.73911: Calling all_inventory to load vars for managed_node2 15247 1726867269.73913: Calling groups_inventory to load vars for managed_node2 15247 1726867269.73916: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.73925: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.73928: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.73931: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.75442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.76326: done with get_vars() 15247 1726867269.76340: done getting variables 15247 1726867269.76406: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli 
output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:21:09 -0400 (0:00:00.382) 0:00:39.474 ****** 15247 1726867269.76489: entering _queue_task() for managed_node2/set_fact 15247 1726867269.76766: worker is 1 (out of 1 available) 15247 1726867269.76779: exiting _queue_task() for managed_node2/set_fact 15247 1726867269.76798: done queuing things up, now waiting for results queue to drain 15247 1726867269.76799: waiting for pending results... 15247 1726867269.77186: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 15247 1726867269.77195: in run() - task 0affcac9-a3a5-8ce3-1923-00000000048e 15247 1726867269.77214: variable 'ansible_search_path' from source: unknown 15247 1726867269.77229: variable 'ansible_search_path' from source: unknown 15247 1726867269.77280: calling self._execute() 15247 1726867269.77451: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.77455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.77458: variable 'omit' from source: magic vars 15247 1726867269.77813: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.77831: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.77974: variable 'nm_profile_exists' from source: set_fact 15247 1726867269.77997: Evaluated conditional (nm_profile_exists.rc == 0): False 15247 1726867269.78002: when evaluation is False, skipping this task 15247 1726867269.78007: _execute() done 15247 1726867269.78011: dumping result to json 15247 1726867269.78017: done dumping result, returning 15247 1726867269.78028: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-8ce3-1923-00000000048e] 
15247 1726867269.78036: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048e skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 15247 1726867269.78174: no more pending results, returning what we have 15247 1726867269.78180: results queue empty 15247 1726867269.78181: checking for any_errors_fatal 15247 1726867269.78198: done checking for any_errors_fatal 15247 1726867269.78199: checking for max_fail_percentage 15247 1726867269.78200: done checking for max_fail_percentage 15247 1726867269.78201: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.78209: done checking to see if all hosts have failed 15247 1726867269.78210: getting the remaining hosts for this loop 15247 1726867269.78211: done getting the remaining hosts for this loop 15247 1726867269.78215: getting the next task for host managed_node2 15247 1726867269.78224: done getting next task for host managed_node2 15247 1726867269.78226: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 15247 1726867269.78230: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 15247 1726867269.78233: getting variables 15247 1726867269.78234: in VariableManager get_vars() 15247 1726867269.78257: Calling all_inventory to load vars for managed_node2 15247 1726867269.78259: Calling groups_inventory to load vars for managed_node2 15247 1726867269.78262: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.78268: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000048e 15247 1726867269.78270: WORKER PROCESS EXITING 15247 1726867269.78281: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.78283: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.78286: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.79036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.80397: done with get_vars() 15247 1726867269.80418: done getting variables 15247 1726867269.80471: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867269.80587: variable 'profile' from source: play vars 15247 1726867269.80591: variable 'interface' from source: set_fact 15247 1726867269.80647: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:21:09 -0400 (0:00:00.041) 0:00:39.516 ****** 15247 1726867269.80684: entering _queue_task() for managed_node2/command 15247 1726867269.80903: worker is 1 (out of 1 available) 15247 1726867269.80917: exiting _queue_task() for 
managed_node2/command 15247 1726867269.80928: done queuing things up, now waiting for results queue to drain 15247 1726867269.80929: waiting for pending results... 15247 1726867269.81135: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 15247 1726867269.81278: in run() - task 0affcac9-a3a5-8ce3-1923-000000000490 15247 1726867269.81282: variable 'ansible_search_path' from source: unknown 15247 1726867269.81285: variable 'ansible_search_path' from source: unknown 15247 1726867269.81486: calling self._execute() 15247 1726867269.81489: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.81492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.81495: variable 'omit' from source: magic vars 15247 1726867269.81779: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.81796: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.81934: variable 'profile_stat' from source: set_fact 15247 1726867269.81956: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867269.81965: when evaluation is False, skipping this task 15247 1726867269.81972: _execute() done 15247 1726867269.81984: dumping result to json 15247 1726867269.81993: done dumping result, returning 15247 1726867269.82003: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000490] 15247 1726867269.82030: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000490 15247 1726867269.82140: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000490 15247 1726867269.82143: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867269.82208: no more pending results, returning what we have 15247 
1726867269.82211: results queue empty 15247 1726867269.82213: checking for any_errors_fatal 15247 1726867269.82222: done checking for any_errors_fatal 15247 1726867269.82223: checking for max_fail_percentage 15247 1726867269.82225: done checking for max_fail_percentage 15247 1726867269.82226: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.82227: done checking to see if all hosts have failed 15247 1726867269.82227: getting the remaining hosts for this loop 15247 1726867269.82228: done getting the remaining hosts for this loop 15247 1726867269.82232: getting the next task for host managed_node2 15247 1726867269.82237: done getting next task for host managed_node2 15247 1726867269.82239: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 15247 1726867269.82243: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867269.82247: getting variables 15247 1726867269.82248: in VariableManager get_vars() 15247 1726867269.82273: Calling all_inventory to load vars for managed_node2 15247 1726867269.82275: Calling groups_inventory to load vars for managed_node2 15247 1726867269.82285: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.82294: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.82296: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.82299: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.83082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.83956: done with get_vars() 15247 1726867269.83970: done getting variables 15247 1726867269.84012: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867269.84103: variable 'profile' from source: play vars 15247 1726867269.84107: variable 'interface' from source: set_fact 15247 1726867269.84162: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:21:09 -0400 (0:00:00.035) 0:00:39.551 ****** 15247 1726867269.84194: entering _queue_task() for managed_node2/set_fact 15247 1726867269.84436: worker is 1 (out of 1 available) 15247 1726867269.84448: exiting _queue_task() for managed_node2/set_fact 15247 1726867269.84462: done queuing things up, now waiting for results queue to drain 15247 1726867269.84463: waiting for pending results... 
15247 1726867269.84893: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 15247 1726867269.84898: in run() - task 0affcac9-a3a5-8ce3-1923-000000000491 15247 1726867269.84902: variable 'ansible_search_path' from source: unknown 15247 1726867269.84905: variable 'ansible_search_path' from source: unknown 15247 1726867269.84908: calling self._execute() 15247 1726867269.84972: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.84986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.85002: variable 'omit' from source: magic vars 15247 1726867269.85320: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.85328: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.85420: variable 'profile_stat' from source: set_fact 15247 1726867269.85427: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867269.85431: when evaluation is False, skipping this task 15247 1726867269.85434: _execute() done 15247 1726867269.85436: dumping result to json 15247 1726867269.85439: done dumping result, returning 15247 1726867269.85446: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000491] 15247 1726867269.85453: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000491 15247 1726867269.85536: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000491 15247 1726867269.85539: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867269.85609: no more pending results, returning what we have 15247 1726867269.85612: results queue empty 15247 1726867269.85615: checking for any_errors_fatal 15247 1726867269.85620: done checking for any_errors_fatal 15247 
1726867269.85621: checking for max_fail_percentage 15247 1726867269.85622: done checking for max_fail_percentage 15247 1726867269.85623: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.85624: done checking to see if all hosts have failed 15247 1726867269.85624: getting the remaining hosts for this loop 15247 1726867269.85626: done getting the remaining hosts for this loop 15247 1726867269.85628: getting the next task for host managed_node2 15247 1726867269.85633: done getting next task for host managed_node2 15247 1726867269.85635: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 15247 1726867269.85638: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867269.85642: getting variables 15247 1726867269.85643: in VariableManager get_vars() 15247 1726867269.85666: Calling all_inventory to load vars for managed_node2 15247 1726867269.85669: Calling groups_inventory to load vars for managed_node2 15247 1726867269.85672: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.85681: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.85683: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.85684: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.86542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.87515: done with get_vars() 15247 1726867269.87538: done getting variables 15247 1726867269.87613: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867269.87726: variable 'profile' from source: play vars 15247 1726867269.87730: variable 'interface' from source: set_fact 15247 1726867269.87794: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:21:09 -0400 (0:00:00.036) 0:00:39.587 ****** 15247 1726867269.87827: entering _queue_task() for managed_node2/command 15247 1726867269.88088: worker is 1 (out of 1 available) 15247 1726867269.88106: exiting _queue_task() for managed_node2/command 15247 1726867269.88116: done queuing things up, now waiting for results queue to drain 15247 1726867269.88118: waiting for pending results... 
15247 1726867269.88414: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 15247 1726867269.88501: in run() - task 0affcac9-a3a5-8ce3-1923-000000000492 15247 1726867269.88516: variable 'ansible_search_path' from source: unknown 15247 1726867269.88523: variable 'ansible_search_path' from source: unknown 15247 1726867269.88556: calling self._execute() 15247 1726867269.88787: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.88792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.88795: variable 'omit' from source: magic vars 15247 1726867269.89316: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.89436: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.89547: variable 'profile_stat' from source: set_fact 15247 1726867269.89586: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867269.89597: when evaluation is False, skipping this task 15247 1726867269.89616: _execute() done 15247 1726867269.89647: dumping result to json 15247 1726867269.89684: done dumping result, returning 15247 1726867269.89868: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000492] 15247 1726867269.89880: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000492 15247 1726867269.89957: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000492 15247 1726867269.89961: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867269.90048: no more pending results, returning what we have 15247 1726867269.90065: results queue empty 15247 1726867269.90066: checking for any_errors_fatal 15247 1726867269.90072: done checking for any_errors_fatal 15247 1726867269.90073: 
checking for max_fail_percentage 15247 1726867269.90074: done checking for max_fail_percentage 15247 1726867269.90075: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.90076: done checking to see if all hosts have failed 15247 1726867269.90078: getting the remaining hosts for this loop 15247 1726867269.90080: done getting the remaining hosts for this loop 15247 1726867269.90088: getting the next task for host managed_node2 15247 1726867269.90098: done getting next task for host managed_node2 15247 1726867269.90101: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 15247 1726867269.90105: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867269.90109: getting variables 15247 1726867269.90110: in VariableManager get_vars() 15247 1726867269.90171: Calling all_inventory to load vars for managed_node2 15247 1726867269.90181: Calling groups_inventory to load vars for managed_node2 15247 1726867269.90185: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.90193: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.90196: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.90199: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.92062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.94383: done with get_vars() 15247 1726867269.94408: done getting variables 15247 1726867269.94480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867269.94603: variable 'profile' from source: play vars 15247 1726867269.94606: variable 'interface' from source: set_fact 15247 1726867269.94663: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:21:09 -0400 (0:00:00.068) 0:00:39.656 ****** 15247 1726867269.94702: entering _queue_task() for managed_node2/set_fact 15247 1726867269.95134: worker is 1 (out of 1 available) 15247 1726867269.95143: exiting _queue_task() for managed_node2/set_fact 15247 1726867269.95153: done queuing things up, now waiting for results queue to drain 15247 1726867269.95155: waiting for pending results... 
15247 1726867269.95472: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 15247 1726867269.95489: in run() - task 0affcac9-a3a5-8ce3-1923-000000000493 15247 1726867269.95513: variable 'ansible_search_path' from source: unknown 15247 1726867269.95521: variable 'ansible_search_path' from source: unknown 15247 1726867269.95559: calling self._execute() 15247 1726867269.95688: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867269.95716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867269.95786: variable 'omit' from source: magic vars 15247 1726867269.96199: variable 'ansible_distribution_major_version' from source: facts 15247 1726867269.96222: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867269.96380: variable 'profile_stat' from source: set_fact 15247 1726867269.96400: Evaluated conditional (profile_stat.stat.exists): False 15247 1726867269.96408: when evaluation is False, skipping this task 15247 1726867269.96419: _execute() done 15247 1726867269.96428: dumping result to json 15247 1726867269.96443: done dumping result, returning 15247 1726867269.96483: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-000000000493] 15247 1726867269.96490: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000493 15247 1726867269.96767: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000493 15247 1726867269.96770: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 15247 1726867269.96821: no more pending results, returning what we have 15247 1726867269.96827: results queue empty 15247 1726867269.96828: checking for any_errors_fatal 15247 1726867269.96835: done checking for any_errors_fatal 15247 1726867269.96836: 
checking for max_fail_percentage 15247 1726867269.96838: done checking for max_fail_percentage 15247 1726867269.96838: checking to see if all hosts have failed and the running result is not ok 15247 1726867269.96839: done checking to see if all hosts have failed 15247 1726867269.96840: getting the remaining hosts for this loop 15247 1726867269.96841: done getting the remaining hosts for this loop 15247 1726867269.96846: getting the next task for host managed_node2 15247 1726867269.96856: done getting next task for host managed_node2 15247 1726867269.96860: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 15247 1726867269.96863: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867269.96868: getting variables 15247 1726867269.96869: in VariableManager get_vars() 15247 1726867269.96906: Calling all_inventory to load vars for managed_node2 15247 1726867269.96909: Calling groups_inventory to load vars for managed_node2 15247 1726867269.96914: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867269.96929: Calling all_plugins_play to load vars for managed_node2 15247 1726867269.96934: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867269.96938: Calling groups_plugins_play to load vars for managed_node2 15247 1726867269.98349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867269.99451: done with get_vars() 15247 1726867269.99470: done getting variables 15247 1726867269.99525: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867269.99634: variable 'profile' from source: play vars 15247 1726867269.99638: variable 'interface' from source: set_fact 15247 1726867269.99699: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 17:21:09 -0400 (0:00:00.050) 0:00:39.706 ****** 15247 1726867269.99730: entering _queue_task() for managed_node2/assert 15247 1726867270.00023: worker is 1 (out of 1 available) 15247 1726867270.00039: exiting _queue_task() for managed_node2/assert 15247 1726867270.00051: done queuing things up, now waiting for results queue to drain 15247 1726867270.00052: waiting for pending results... 
15247 1726867270.00495: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' 15247 1726867270.00499: in run() - task 0affcac9-a3a5-8ce3-1923-000000000480 15247 1726867270.00502: variable 'ansible_search_path' from source: unknown 15247 1726867270.00505: variable 'ansible_search_path' from source: unknown 15247 1726867270.00507: calling self._execute() 15247 1726867270.00559: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867270.00571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867270.00591: variable 'omit' from source: magic vars 15247 1726867270.00958: variable 'ansible_distribution_major_version' from source: facts 15247 1726867270.00974: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867270.00986: variable 'omit' from source: magic vars 15247 1726867270.01031: variable 'omit' from source: magic vars 15247 1726867270.01132: variable 'profile' from source: play vars 15247 1726867270.01141: variable 'interface' from source: set_fact 15247 1726867270.01217: variable 'interface' from source: set_fact 15247 1726867270.01241: variable 'omit' from source: magic vars 15247 1726867270.01295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867270.01340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867270.01365: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867270.01393: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867270.01409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867270.01449: variable 'inventory_hostname' from source: host vars for 
'managed_node2' 15247 1726867270.01491: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867270.01494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867270.01591: Set connection var ansible_shell_executable to /bin/sh 15247 1726867270.01605: Set connection var ansible_connection to ssh 15247 1726867270.01612: Set connection var ansible_shell_type to sh 15247 1726867270.01623: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867270.01708: Set connection var ansible_timeout to 10 15247 1726867270.01712: Set connection var ansible_pipelining to False 15247 1726867270.01714: variable 'ansible_shell_executable' from source: unknown 15247 1726867270.01716: variable 'ansible_connection' from source: unknown 15247 1726867270.01718: variable 'ansible_module_compression' from source: unknown 15247 1726867270.01719: variable 'ansible_shell_type' from source: unknown 15247 1726867270.01721: variable 'ansible_shell_executable' from source: unknown 15247 1726867270.01723: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867270.01725: variable 'ansible_pipelining' from source: unknown 15247 1726867270.01727: variable 'ansible_timeout' from source: unknown 15247 1726867270.01729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867270.01858: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867270.01878: variable 'omit' from source: magic vars 15247 1726867270.01890: starting attempt loop 15247 1726867270.01897: running the handler 15247 1726867270.02015: variable 'lsr_net_profile_exists' from source: set_fact 15247 1726867270.02025: Evaluated conditional (not 
lsr_net_profile_exists): True 15247 1726867270.02038: handler run complete 15247 1726867270.02055: attempt loop complete, returning result 15247 1726867270.02062: _execute() done 15247 1726867270.02142: dumping result to json 15247 1726867270.02145: done dumping result, returning 15247 1726867270.02148: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'LSR-TST-br31' [0affcac9-a3a5-8ce3-1923-000000000480] 15247 1726867270.02150: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000480 15247 1726867270.02219: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000480 15247 1726867270.02223: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867270.02291: no more pending results, returning what we have 15247 1726867270.02294: results queue empty 15247 1726867270.02295: checking for any_errors_fatal 15247 1726867270.02304: done checking for any_errors_fatal 15247 1726867270.02305: checking for max_fail_percentage 15247 1726867270.02307: done checking for max_fail_percentage 15247 1726867270.02308: checking to see if all hosts have failed and the running result is not ok 15247 1726867270.02309: done checking to see if all hosts have failed 15247 1726867270.02310: getting the remaining hosts for this loop 15247 1726867270.02311: done getting the remaining hosts for this loop 15247 1726867270.02314: getting the next task for host managed_node2 15247 1726867270.02324: done getting next task for host managed_node2 15247 1726867270.02326: ^ task is: TASK: meta (flush_handlers) 15247 1726867270.02327: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867270.02331: getting variables 15247 1726867270.02333: in VariableManager get_vars() 15247 1726867270.02362: Calling all_inventory to load vars for managed_node2 15247 1726867270.02365: Calling groups_inventory to load vars for managed_node2 15247 1726867270.02369: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867270.02386: Calling all_plugins_play to load vars for managed_node2 15247 1726867270.02390: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867270.02393: Calling groups_plugins_play to load vars for managed_node2 15247 1726867270.03937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867270.05521: done with get_vars() 15247 1726867270.05542: done getting variables 15247 1726867270.05613: in VariableManager get_vars() 15247 1726867270.05621: Calling all_inventory to load vars for managed_node2 15247 1726867270.05623: Calling groups_inventory to load vars for managed_node2 15247 1726867270.05625: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867270.05629: Calling all_plugins_play to load vars for managed_node2 15247 1726867270.05631: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867270.05634: Calling groups_plugins_play to load vars for managed_node2 15247 1726867270.06889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867270.08444: done with get_vars() 15247 1726867270.08474: done queuing things up, now waiting for results queue to drain 15247 1726867270.08476: results queue empty 15247 1726867270.08479: checking for any_errors_fatal 15247 1726867270.08481: done checking for any_errors_fatal 15247 1726867270.08482: checking for max_fail_percentage 15247 1726867270.08483: done checking for max_fail_percentage 15247 1726867270.08483: checking to see if all hosts have failed and the running result is not 
ok 15247 1726867270.08488: done checking to see if all hosts have failed 15247 1726867270.08489: getting the remaining hosts for this loop 15247 1726867270.08490: done getting the remaining hosts for this loop 15247 1726867270.08493: getting the next task for host managed_node2 15247 1726867270.08497: done getting next task for host managed_node2 15247 1726867270.08500: ^ task is: TASK: meta (flush_handlers) 15247 1726867270.08501: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867270.08504: getting variables 15247 1726867270.08505: in VariableManager get_vars() 15247 1726867270.08513: Calling all_inventory to load vars for managed_node2 15247 1726867270.08514: Calling groups_inventory to load vars for managed_node2 15247 1726867270.08517: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867270.08521: Calling all_plugins_play to load vars for managed_node2 15247 1726867270.08523: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867270.08526: Calling groups_plugins_play to load vars for managed_node2 15247 1726867270.09641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867270.11164: done with get_vars() 15247 1726867270.11185: done getting variables 15247 1726867270.11247: in VariableManager get_vars() 15247 1726867270.11255: Calling all_inventory to load vars for managed_node2 15247 1726867270.11257: Calling groups_inventory to load vars for managed_node2 15247 1726867270.11260: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867270.11264: Calling all_plugins_play to load vars for managed_node2 15247 1726867270.11267: Calling groups_plugins_inventory to load vars for 
managed_node2 15247 1726867270.11269: Calling groups_plugins_play to load vars for managed_node2 15247 1726867270.12424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867270.14038: done with get_vars() 15247 1726867270.14066: done queuing things up, now waiting for results queue to drain 15247 1726867270.14068: results queue empty 15247 1726867270.14069: checking for any_errors_fatal 15247 1726867270.14070: done checking for any_errors_fatal 15247 1726867270.14071: checking for max_fail_percentage 15247 1726867270.14072: done checking for max_fail_percentage 15247 1726867270.14072: checking to see if all hosts have failed and the running result is not ok 15247 1726867270.14073: done checking to see if all hosts have failed 15247 1726867270.14074: getting the remaining hosts for this loop 15247 1726867270.14075: done getting the remaining hosts for this loop 15247 1726867270.14079: getting the next task for host managed_node2 15247 1726867270.14082: done getting next task for host managed_node2 15247 1726867270.14083: ^ task is: None 15247 1726867270.14084: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867270.14085: done queuing things up, now waiting for results queue to drain 15247 1726867270.14086: results queue empty 15247 1726867270.14086: checking for any_errors_fatal 15247 1726867270.14087: done checking for any_errors_fatal 15247 1726867270.14088: checking for max_fail_percentage 15247 1726867270.14089: done checking for max_fail_percentage 15247 1726867270.14089: checking to see if all hosts have failed and the running result is not ok 15247 1726867270.14090: done checking to see if all hosts have failed 15247 1726867270.14091: getting the next task for host managed_node2 15247 1726867270.14093: done getting next task for host managed_node2 15247 1726867270.14094: ^ task is: None 15247 1726867270.14095: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867270.14134: in VariableManager get_vars() 15247 1726867270.14149: done with get_vars() 15247 1726867270.14154: in VariableManager get_vars() 15247 1726867270.14163: done with get_vars() 15247 1726867270.14167: variable 'omit' from source: magic vars 15247 1726867270.14287: variable 'task' from source: play vars 15247 1726867270.14320: in VariableManager get_vars() 15247 1726867270.14330: done with get_vars() 15247 1726867270.14348: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 15247 1726867270.14599: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867270.14628: getting the remaining hosts for this loop 15247 1726867270.14629: done getting the remaining hosts for this loop 15247 1726867270.14631: getting the next task for host managed_node2 15247 1726867270.14634: done getting next task for host managed_node2 15247 1726867270.14636: ^ task is: TASK: Gathering Facts 15247 1726867270.14637: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867270.14639: getting variables 15247 1726867270.14640: in VariableManager get_vars() 15247 1726867270.14648: Calling all_inventory to load vars for managed_node2 15247 1726867270.14650: Calling groups_inventory to load vars for managed_node2 15247 1726867270.14652: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867270.14657: Calling all_plugins_play to load vars for managed_node2 15247 1726867270.14659: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867270.14662: Calling groups_plugins_play to load vars for managed_node2 15247 1726867270.15924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867270.17654: done with get_vars() 15247 1726867270.17673: done getting variables 15247 1726867270.17725: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Friday 20 September 2024 17:21:10 -0400 (0:00:00.180) 0:00:39.887 ****** 15247 1726867270.17750: entering _queue_task() for managed_node2/gather_facts 15247 1726867270.18067: worker is 1 (out of 1 available) 15247 1726867270.18281: exiting _queue_task() for managed_node2/gather_facts 15247 1726867270.18291: done queuing things up, now waiting for results queue to drain 15247 1726867270.18292: waiting for pending results... 
15247 1726867270.18358: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867270.18519: in run() - task 0affcac9-a3a5-8ce3-1923-0000000004c5 15247 1726867270.18523: variable 'ansible_search_path' from source: unknown 15247 1726867270.18539: calling self._execute() 15247 1726867270.18633: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867270.18644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867270.18655: variable 'omit' from source: magic vars 15247 1726867270.19061: variable 'ansible_distribution_major_version' from source: facts 15247 1726867270.19064: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867270.19066: variable 'omit' from source: magic vars 15247 1726867270.19079: variable 'omit' from source: magic vars 15247 1726867270.19119: variable 'omit' from source: magic vars 15247 1726867270.19166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867270.19211: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867270.19283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867270.19287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867270.19289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867270.19305: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867270.19312: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867270.19320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867270.19428: Set connection var ansible_shell_executable to /bin/sh 15247 1726867270.19436: Set 
connection var ansible_connection to ssh 15247 1726867270.19442: Set connection var ansible_shell_type to sh 15247 1726867270.19453: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867270.19463: Set connection var ansible_timeout to 10 15247 1726867270.19473: Set connection var ansible_pipelining to False 15247 1726867270.19609: variable 'ansible_shell_executable' from source: unknown 15247 1726867270.19612: variable 'ansible_connection' from source: unknown 15247 1726867270.19614: variable 'ansible_module_compression' from source: unknown 15247 1726867270.19617: variable 'ansible_shell_type' from source: unknown 15247 1726867270.19619: variable 'ansible_shell_executable' from source: unknown 15247 1726867270.19621: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867270.19623: variable 'ansible_pipelining' from source: unknown 15247 1726867270.19625: variable 'ansible_timeout' from source: unknown 15247 1726867270.19627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867270.19725: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867270.19740: variable 'omit' from source: magic vars 15247 1726867270.19749: starting attempt loop 15247 1726867270.19755: running the handler 15247 1726867270.19775: variable 'ansible_facts' from source: unknown 15247 1726867270.19798: _low_level_execute_command(): starting 15247 1726867270.19810: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867270.20533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867270.20537: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.20540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867270.20542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867270.20544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.20629: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867270.20668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867270.22373: stdout chunk (state=3): >>>/root <<< 15247 1726867270.22526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867270.22558: stdout chunk (state=3): >>><<< 15247 1726867270.22561: stderr chunk (state=3): >>><<< 15247 1726867270.22580: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867270.22599: _low_level_execute_command(): starting 15247 1726867270.22674: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094 `" && echo ansible-tmp-1726867270.2258642-17088-23970827500094="` echo /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094 `" ) && sleep 0' 15247 1726867270.23272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867270.23289: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867270.23303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867270.23320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867270.23337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867270.23397: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.23452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867270.23471: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867270.23487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867270.23561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867270.25501: stdout chunk (state=3): >>>ansible-tmp-1726867270.2258642-17088-23970827500094=/root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094 <<< 15247 1726867270.25639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867270.25658: stdout chunk (state=3): >>><<< 15247 1726867270.25672: stderr chunk (state=3): >>><<< 15247 1726867270.25883: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867270.2258642-17088-23970827500094=/root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867270.25887: variable 'ansible_module_compression' from source: unknown 15247 1726867270.25889: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867270.25891: variable 'ansible_facts' from source: unknown 15247 1726867270.26241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py 15247 1726867270.26467: Sending initial data 15247 1726867270.26470: Sent initial data (153 bytes) 15247 1726867270.27062: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867270.27075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867270.27112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867270.27129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.27216: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.27243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867270.27257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867270.27281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867270.27355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867270.28932: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867270.28994: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867270.29059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpth873_6t /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py <<< 15247 1726867270.29073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py" <<< 15247 1726867270.29111: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15247 1726867270.29138: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpth873_6t" to remote "/root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py" <<< 15247 1726867270.30620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867270.30775: stderr chunk (state=3): >>><<< 15247 1726867270.30780: stdout chunk (state=3): >>><<< 15247 1726867270.30783: done transferring module to remote 15247 1726867270.30785: _low_level_execute_command(): starting 15247 1726867270.30787: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/ /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py && sleep 0' 15247 1726867270.31311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867270.31327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867270.31338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867270.31392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.31448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867270.31473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867270.31545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867270.33368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867270.33384: stdout chunk (state=3): >>><<< 15247 1726867270.33399: stderr chunk (state=3): >>><<< 15247 1726867270.33497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867270.33500: _low_level_execute_command(): starting 15247 1726867270.33503: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/AnsiballZ_setup.py && sleep 0' 15247 1726867270.34047: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867270.34061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867270.34075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867270.34095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867270.34144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867270.34208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867270.34244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867270.34274: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867270.34326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867270.98422: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": 
"/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], 
"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.61865234375, "5m": 0.40576171875, "15m": 0.20068359375}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off 
[fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", 
"rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", 
"127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, 
"rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 508, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794844672, "block_size": 4096, "block_total": 65519099, "block_available": 63914757, "block_used": 1604342, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "10", "epoch": "1726867270", "epoch_int": "1726867270", "date": "2024-09-20", "time": "17:21:10", "iso8601_micro": "2024-09-20T21:21:10.975174Z", "iso8601": "2024-09-20T21:21:10Z", "iso8601_basic": "20240920T172110975174", "iso8601_basic_short": "20240920T172110", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": 
"disabled"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867271.00585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867271.00588: stdout chunk (state=3): >>><<< 15247 1726867271.00591: stderr chunk (state=3): >>><<< 15247 1726867271.00593: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_loadavg": {"1m": 0.61865234375, "5m": 0.40576171875, "15m": 0.20068359375}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off 
[fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", 
"scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, 
"sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 508, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794844672, "block_size": 4096, "block_total": 65519099, "block_available": 63914757, "block_used": 1604342, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_date_time": {"year": "2024", 
"month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "10", "epoch": "1726867270", "epoch_int": "1726867270", "date": "2024-09-20", "time": "17:21:10", "iso8601_micro": "2024-09-20T21:21:10.975174Z", "iso8601": "2024-09-20T21:21:10Z", "iso8601_basic": "20240920T172110975174", "iso8601_basic_short": "20240920T172110", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867271.01287: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867271.01291: _low_level_execute_command(): starting 15247 1726867271.01293: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867270.2258642-17088-23970827500094/ > /dev/null 2>&1 && sleep 0' 15247 1726867271.02597: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867271.02613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867271.02632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867271.02706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.04599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867271.04618: stdout chunk (state=3): >>><<< 15247 1726867271.04631: stderr chunk (state=3): >>><<< 15247 1726867271.04652: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867271.04667: handler run complete 15247 1726867271.04993: variable 'ansible_facts' from source: unknown 15247 1726867271.05314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.05755: variable 'ansible_facts' from source: unknown 15247 1726867271.05904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.06219: attempt loop complete, returning result 15247 1726867271.06259: _execute() done 15247 1726867271.06267: dumping result to json 15247 1726867271.06313: done dumping result, returning 15247 1726867271.06370: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-0000000004c5] 15247 1726867271.06383: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004c5 15247 1726867271.07156: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004c5 15247 1726867271.07160: WORKER PROCESS EXITING ok: [managed_node2] 15247 1726867271.07547: no more pending results, returning what we have 15247 1726867271.07550: results queue empty 15247 1726867271.07551: checking for any_errors_fatal 15247 1726867271.07552: done checking for any_errors_fatal 15247 1726867271.07553: checking for max_fail_percentage 15247 1726867271.07555: done checking for max_fail_percentage 15247 1726867271.07555: checking to see if all hosts have failed and the running result is not ok 15247 1726867271.07556: done checking to see if all hosts have failed 15247 1726867271.07557: getting the remaining hosts for this loop 15247 1726867271.07558: done getting the remaining hosts for this loop 15247 1726867271.07562: getting the next task for host managed_node2 15247 1726867271.07566: done getting next task for host 
managed_node2 15247 1726867271.07568: ^ task is: TASK: meta (flush_handlers) 15247 1726867271.07570: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867271.07574: getting variables 15247 1726867271.07575: in VariableManager get_vars() 15247 1726867271.07599: Calling all_inventory to load vars for managed_node2 15247 1726867271.07601: Calling groups_inventory to load vars for managed_node2 15247 1726867271.07605: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.07616: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.07619: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.07621: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.10657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.14492: done with get_vars() 15247 1726867271.14517: done getting variables 15247 1726867271.14707: in VariableManager get_vars() 15247 1726867271.14721: Calling all_inventory to load vars for managed_node2 15247 1726867271.14723: Calling groups_inventory to load vars for managed_node2 15247 1726867271.14725: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.14731: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.14733: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.14736: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.16798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.18531: done with get_vars() 15247 1726867271.18563: done queuing things up, now waiting for 
results queue to drain 15247 1726867271.18565: results queue empty 15247 1726867271.18566: checking for any_errors_fatal 15247 1726867271.18570: done checking for any_errors_fatal 15247 1726867271.18571: checking for max_fail_percentage 15247 1726867271.18572: done checking for max_fail_percentage 15247 1726867271.18573: checking to see if all hosts have failed and the running result is not ok 15247 1726867271.18574: done checking to see if all hosts have failed 15247 1726867271.18580: getting the remaining hosts for this loop 15247 1726867271.18581: done getting the remaining hosts for this loop 15247 1726867271.18584: getting the next task for host managed_node2 15247 1726867271.18588: done getting next task for host managed_node2 15247 1726867271.18590: ^ task is: TASK: Include the task '{{ task }}' 15247 1726867271.18592: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867271.18594: getting variables 15247 1726867271.18595: in VariableManager get_vars() 15247 1726867271.18604: Calling all_inventory to load vars for managed_node2 15247 1726867271.18606: Calling groups_inventory to load vars for managed_node2 15247 1726867271.18609: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.18616: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.18619: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.18623: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.19900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.21756: done with get_vars() 15247 1726867271.21775: done getting variables 15247 1726867271.21948: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Friday 20 September 2024 17:21:11 -0400 (0:00:01.042) 0:00:40.929 ****** 15247 1726867271.21979: entering _queue_task() for managed_node2/include_tasks 15247 1726867271.22434: worker is 1 (out of 1 available) 15247 1726867271.22445: exiting _queue_task() for managed_node2/include_tasks 15247 1726867271.22455: done queuing things up, now waiting for results queue to drain 15247 1726867271.22456: waiting for pending results... 
15247 1726867271.22776: running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_absent.yml' 15247 1726867271.22782: in run() - task 0affcac9-a3a5-8ce3-1923-000000000077 15247 1726867271.22786: variable 'ansible_search_path' from source: unknown 15247 1726867271.22812: calling self._execute() 15247 1726867271.22925: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867271.22937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867271.22953: variable 'omit' from source: magic vars 15247 1726867271.23386: variable 'ansible_distribution_major_version' from source: facts 15247 1726867271.23409: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867271.23489: variable 'task' from source: play vars 15247 1726867271.23528: variable 'task' from source: play vars 15247 1726867271.23542: _execute() done 15247 1726867271.23552: dumping result to json 15247 1726867271.23560: done dumping result, returning 15247 1726867271.23569: done running TaskExecutor() for managed_node2/TASK: Include the task 'tasks/assert_device_absent.yml' [0affcac9-a3a5-8ce3-1923-000000000077] 15247 1726867271.23579: sending task result for task 0affcac9-a3a5-8ce3-1923-000000000077 15247 1726867271.23987: no more pending results, returning what we have 15247 1726867271.23992: in VariableManager get_vars() 15247 1726867271.24020: Calling all_inventory to load vars for managed_node2 15247 1726867271.24022: Calling groups_inventory to load vars for managed_node2 15247 1726867271.24025: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.24033: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.24036: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.24038: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.24691: done sending task result for task 0affcac9-a3a5-8ce3-1923-000000000077 
15247 1726867271.24695: WORKER PROCESS EXITING 15247 1726867271.25401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.27030: done with get_vars() 15247 1726867271.27050: variable 'ansible_search_path' from source: unknown 15247 1726867271.27064: we have included files to process 15247 1726867271.27065: generating all_blocks data 15247 1726867271.27067: done generating all_blocks data 15247 1726867271.27068: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15247 1726867271.27069: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15247 1726867271.27071: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 15247 1726867271.27184: in VariableManager get_vars() 15247 1726867271.27202: done with get_vars() 15247 1726867271.27324: done processing included file 15247 1726867271.27326: iterating over new_blocks loaded from include file 15247 1726867271.27327: in VariableManager get_vars() 15247 1726867271.27341: done with get_vars() 15247 1726867271.27342: filtering new block on tags 15247 1726867271.27359: done filtering new block on tags 15247 1726867271.27361: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 15247 1726867271.27367: extending task lists for all hosts with included blocks 15247 1726867271.27402: done extending task lists 15247 1726867271.27403: done processing included files 15247 1726867271.27404: results queue empty 15247 1726867271.27404: checking for any_errors_fatal 15247 1726867271.27406: done checking for any_errors_fatal 15247 
1726867271.27406: checking for max_fail_percentage 15247 1726867271.27407: done checking for max_fail_percentage 15247 1726867271.27408: checking to see if all hosts have failed and the running result is not ok 15247 1726867271.27409: done checking to see if all hosts have failed 15247 1726867271.27409: getting the remaining hosts for this loop 15247 1726867271.27410: done getting the remaining hosts for this loop 15247 1726867271.27413: getting the next task for host managed_node2 15247 1726867271.27420: done getting next task for host managed_node2 15247 1726867271.27423: ^ task is: TASK: Include the task 'get_interface_stat.yml' 15247 1726867271.27425: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867271.27427: getting variables 15247 1726867271.27428: in VariableManager get_vars() 15247 1726867271.27436: Calling all_inventory to load vars for managed_node2 15247 1726867271.27438: Calling groups_inventory to load vars for managed_node2 15247 1726867271.27440: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.27445: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.27448: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.27450: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.34190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.37554: done with get_vars() 15247 1726867271.37716: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 17:21:11 -0400 (0:00:00.158) 0:00:41.087 ****** 15247 1726867271.37792: entering _queue_task() for managed_node2/include_tasks 15247 1726867271.38549: worker is 1 (out of 1 available) 15247 1726867271.38562: exiting _queue_task() for managed_node2/include_tasks 15247 1726867271.38575: done queuing things up, now waiting for results queue to drain 15247 1726867271.38576: waiting for pending results... 
15247 1726867271.39197: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 15247 1726867271.39315: in run() - task 0affcac9-a3a5-8ce3-1923-0000000004d6 15247 1726867271.39336: variable 'ansible_search_path' from source: unknown 15247 1726867271.39882: variable 'ansible_search_path' from source: unknown 15247 1726867271.39887: calling self._execute() 15247 1726867271.39891: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867271.39894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867271.39897: variable 'omit' from source: magic vars 15247 1726867271.40911: variable 'ansible_distribution_major_version' from source: facts 15247 1726867271.41098: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867271.41113: _execute() done 15247 1726867271.41123: dumping result to json 15247 1726867271.41132: done dumping result, returning 15247 1726867271.41145: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-8ce3-1923-0000000004d6] 15247 1726867271.41157: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004d6 15247 1726867271.41291: no more pending results, returning what we have 15247 1726867271.41297: in VariableManager get_vars() 15247 1726867271.41338: Calling all_inventory to load vars for managed_node2 15247 1726867271.41341: Calling groups_inventory to load vars for managed_node2 15247 1726867271.41345: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.41359: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.41362: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.41365: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.42774: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004d6 15247 1726867271.43384: WORKER PROCESS EXITING 15247 
1726867271.44206: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.47568: done with get_vars() 15247 1726867271.47593: variable 'ansible_search_path' from source: unknown 15247 1726867271.47594: variable 'ansible_search_path' from source: unknown 15247 1726867271.47605: variable 'task' from source: play vars 15247 1726867271.47829: variable 'task' from source: play vars 15247 1726867271.47863: we have included files to process 15247 1726867271.47865: generating all_blocks data 15247 1726867271.47866: done generating all_blocks data 15247 1726867271.47868: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867271.47869: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867271.47871: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 15247 1726867271.48283: done processing included file 15247 1726867271.48285: iterating over new_blocks loaded from include file 15247 1726867271.48287: in VariableManager get_vars() 15247 1726867271.48300: done with get_vars() 15247 1726867271.48302: filtering new block on tags 15247 1726867271.48434: done filtering new block on tags 15247 1726867271.48438: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 15247 1726867271.48443: extending task lists for all hosts with included blocks 15247 1726867271.48615: done extending task lists 15247 1726867271.48617: done processing included files 15247 1726867271.48618: results queue empty 15247 1726867271.48618: checking for any_errors_fatal 15247 1726867271.48622: done checking 
for any_errors_fatal 15247 1726867271.48623: checking for max_fail_percentage 15247 1726867271.48624: done checking for max_fail_percentage 15247 1726867271.48625: checking to see if all hosts have failed and the running result is not ok 15247 1726867271.48626: done checking to see if all hosts have failed 15247 1726867271.48626: getting the remaining hosts for this loop 15247 1726867271.48627: done getting the remaining hosts for this loop 15247 1726867271.48630: getting the next task for host managed_node2 15247 1726867271.48634: done getting next task for host managed_node2 15247 1726867271.48636: ^ task is: TASK: Get stat for interface {{ interface }} 15247 1726867271.48639: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867271.48641: getting variables 15247 1726867271.48642: in VariableManager get_vars() 15247 1726867271.48765: Calling all_inventory to load vars for managed_node2 15247 1726867271.48768: Calling groups_inventory to load vars for managed_node2 15247 1726867271.48770: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.48776: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.48866: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.48870: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.51370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867271.54878: done with get_vars() 15247 1726867271.54905: done getting variables 15247 1726867271.55159: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:21:11 -0400 (0:00:00.173) 0:00:41.261 ****** 15247 1726867271.55192: entering _queue_task() for managed_node2/stat 15247 1726867271.55948: worker is 1 (out of 1 available) 15247 1726867271.56078: exiting _queue_task() for managed_node2/stat 15247 1726867271.56090: done queuing things up, now waiting for results queue to drain 15247 1726867271.56092: waiting for pending results... 
15247 1726867271.56390: running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 15247 1726867271.56784: in run() - task 0affcac9-a3a5-8ce3-1923-0000000004e1 15247 1726867271.56788: variable 'ansible_search_path' from source: unknown 15247 1726867271.56791: variable 'ansible_search_path' from source: unknown 15247 1726867271.57085: calling self._execute() 15247 1726867271.57088: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867271.57091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867271.57107: variable 'omit' from source: magic vars 15247 1726867271.57863: variable 'ansible_distribution_major_version' from source: facts 15247 1726867271.57884: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867271.57898: variable 'omit' from source: magic vars 15247 1726867271.58060: variable 'omit' from source: magic vars 15247 1726867271.58158: variable 'interface' from source: set_fact 15247 1726867271.58384: variable 'omit' from source: magic vars 15247 1726867271.58388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867271.58390: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867271.58494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867271.58519: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867271.58538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867271.58572: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867271.58582: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867271.58605: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867271.58825: Set connection var ansible_shell_executable to /bin/sh 15247 1726867271.58833: Set connection var ansible_connection to ssh 15247 1726867271.58840: Set connection var ansible_shell_type to sh 15247 1726867271.58982: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867271.58985: Set connection var ansible_timeout to 10 15247 1726867271.58988: Set connection var ansible_pipelining to False 15247 1726867271.58990: variable 'ansible_shell_executable' from source: unknown 15247 1726867271.58993: variable 'ansible_connection' from source: unknown 15247 1726867271.58995: variable 'ansible_module_compression' from source: unknown 15247 1726867271.58997: variable 'ansible_shell_type' from source: unknown 15247 1726867271.59005: variable 'ansible_shell_executable' from source: unknown 15247 1726867271.59013: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867271.59021: variable 'ansible_pipelining' from source: unknown 15247 1726867271.59032: variable 'ansible_timeout' from source: unknown 15247 1726867271.59041: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867271.59454: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 15247 1726867271.59484: variable 'omit' from source: magic vars 15247 1726867271.59693: starting attempt loop 15247 1726867271.59696: running the handler 15247 1726867271.59698: _low_level_execute_command(): starting 15247 1726867271.59700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867271.60912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867271.60930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867271.60995: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867271.61112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867271.61124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867271.61222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867271.61264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.62949: stdout chunk (state=3): >>>/root <<< 15247 1726867271.63053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867271.63094: stderr chunk (state=3): >>><<< 15247 1726867271.63104: stdout chunk (state=3): >>><<< 15247 1726867271.63384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 
originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867271.63388: _low_level_execute_command(): starting 15247 1726867271.63391: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468 `" && echo ansible-tmp-1726867271.6329923-17130-125958511965468="` echo /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468 `" ) && sleep 0' 15247 1726867271.64571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867271.64797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867271.64876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.66866: stdout chunk (state=3): >>>ansible-tmp-1726867271.6329923-17130-125958511965468=/root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468 <<< 15247 1726867271.67262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867271.67266: stdout chunk (state=3): >>><<< 15247 1726867271.67269: stderr chunk (state=3): >>><<< 15247 1726867271.67345: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867271.6329923-17130-125958511965468=/root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867271.67350: variable 'ansible_module_compression' from source: unknown 15247 1726867271.67418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 15247 1726867271.67609: variable 'ansible_facts' from source: unknown 15247 1726867271.67697: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py 15247 1726867271.68016: Sending initial data 15247 1726867271.68019: Sent initial data (153 bytes) 15247 1726867271.69124: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867271.69260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867271.69272: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 15247 1726867271.69281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 is address <<< 15247 1726867271.69301: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 15247 1726867271.69309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867271.69323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 
1726867271.69335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867271.69366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867271.69374: stderr chunk (state=3): >>>debug2: match found <<< 15247 1726867271.69387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867271.69551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867271.69579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867271.69731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.71295: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867271.71345: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867271.71411: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpoww7i2vt /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py <<< 15247 1726867271.71417: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpoww7i2vt" to remote "/root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py" <<< 15247 1726867271.72861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867271.72865: stdout chunk (state=3): >>><<< 15247 1726867271.72873: stderr chunk (state=3): >>><<< 15247 1726867271.72897: done transferring module to remote 15247 1726867271.72911: _low_level_execute_command(): starting 15247 1726867271.72916: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/ /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py && sleep 0' 15247 1726867271.74348: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867271.74383: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867271.74401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867271.74506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867271.74527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867271.74732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.77079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867271.77083: stdout chunk (state=3): >>><<< 15247 1726867271.77085: stderr chunk (state=3): >>><<< 15247 1726867271.77088: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867271.77091: _low_level_execute_command(): starting 15247 1726867271.77093: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/AnsiballZ_stat.py && sleep 0' 15247 1726867271.77759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867271.77804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867271.77887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.93639: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": 
{"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 15247 1726867271.95090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867271.95127: stdout chunk (state=3): >>><<< 15247 1726867271.95154: stderr chunk (state=3): >>><<< 15247 1726867271.95176: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867271.95212: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867271.95241: _low_level_execute_command(): starting 15247 1726867271.95287: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867271.6329923-17130-125958511965468/ > /dev/null 2>&1 && sleep 0' 15247 1726867271.95850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867271.95863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867271.95874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867271.95935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867271.95951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867271.96000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867271.97844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867271.97864: stderr chunk (state=3): >>><<< 15247 1726867271.97867: stdout chunk (state=3): >>><<< 15247 1726867271.97944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 15247 1726867271.97951: handler run complete 15247 1726867271.97953: attempt loop complete, returning result 15247 1726867271.97956: _execute() done 15247 1726867271.97957: dumping result to json 15247 1726867271.97959: done dumping result, returning 15247 1726867271.97961: done running TaskExecutor() for managed_node2/TASK: Get stat for interface LSR-TST-br31 [0affcac9-a3a5-8ce3-1923-0000000004e1] 15247 1726867271.97962: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004e1 15247 1726867271.98038: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004e1 15247 1726867271.98042: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 15247 1726867271.98228: no more pending results, returning what we have 15247 1726867271.98232: results queue empty 15247 1726867271.98233: checking for any_errors_fatal 15247 1726867271.98235: done checking for any_errors_fatal 15247 1726867271.98236: checking for max_fail_percentage 15247 1726867271.98237: done checking for max_fail_percentage 15247 1726867271.98238: checking to see if all hosts have failed and the running result is not ok 15247 1726867271.98239: done checking to see if all hosts have failed 15247 1726867271.98240: getting the remaining hosts for this loop 15247 1726867271.98241: done getting the remaining hosts for this loop 15247 1726867271.98248: getting the next task for host managed_node2 15247 1726867271.98255: done getting next task for host managed_node2 15247 1726867271.98258: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 15247 1726867271.98261: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867271.98265: getting variables 15247 1726867271.98269: in VariableManager get_vars() 15247 1726867271.98305: Calling all_inventory to load vars for managed_node2 15247 1726867271.98308: Calling groups_inventory to load vars for managed_node2 15247 1726867271.98311: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867271.98323: Calling all_plugins_play to load vars for managed_node2 15247 1726867271.98328: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867271.98331: Calling groups_plugins_play to load vars for managed_node2 15247 1726867271.99766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867272.01623: done with get_vars() 15247 1726867272.01647: done getting variables 15247 1726867272.01731: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 15247 1726867272.01905: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 17:21:12 -0400 (0:00:00.467) 0:00:41.729 ****** 15247 1726867272.01953: entering _queue_task() for managed_node2/assert 15247 1726867272.02453: worker is 1 (out of 1 available) 15247 1726867272.02470: exiting _queue_task() for managed_node2/assert 15247 1726867272.02485: done queuing things up, now waiting for results queue to drain 15247 
1726867272.02486: waiting for pending results... 15247 1726867272.02741: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' 15247 1726867272.02835: in run() - task 0affcac9-a3a5-8ce3-1923-0000000004d7 15247 1726867272.02856: variable 'ansible_search_path' from source: unknown 15247 1726867272.02860: variable 'ansible_search_path' from source: unknown 15247 1726867272.02930: calling self._execute() 15247 1726867272.03040: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867272.03049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867272.03053: variable 'omit' from source: magic vars 15247 1726867272.03530: variable 'ansible_distribution_major_version' from source: facts 15247 1726867272.03534: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867272.03548: variable 'omit' from source: magic vars 15247 1726867272.03576: variable 'omit' from source: magic vars 15247 1726867272.03644: variable 'interface' from source: set_fact 15247 1726867272.03659: variable 'omit' from source: magic vars 15247 1726867272.03701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867272.03730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867272.03746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867272.03759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867272.03768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867272.03794: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867272.03797: variable 'ansible_host' from source: host 
vars for 'managed_node2' 15247 1726867272.03800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867272.03870: Set connection var ansible_shell_executable to /bin/sh 15247 1726867272.03874: Set connection var ansible_connection to ssh 15247 1726867272.03876: Set connection var ansible_shell_type to sh 15247 1726867272.03880: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867272.03888: Set connection var ansible_timeout to 10 15247 1726867272.03895: Set connection var ansible_pipelining to False 15247 1726867272.03912: variable 'ansible_shell_executable' from source: unknown 15247 1726867272.03915: variable 'ansible_connection' from source: unknown 15247 1726867272.03920: variable 'ansible_module_compression' from source: unknown 15247 1726867272.03922: variable 'ansible_shell_type' from source: unknown 15247 1726867272.03925: variable 'ansible_shell_executable' from source: unknown 15247 1726867272.03927: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867272.03932: variable 'ansible_pipelining' from source: unknown 15247 1726867272.03934: variable 'ansible_timeout' from source: unknown 15247 1726867272.03937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867272.04048: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867272.04057: variable 'omit' from source: magic vars 15247 1726867272.04065: starting attempt loop 15247 1726867272.04068: running the handler 15247 1726867272.04164: variable 'interface_stat' from source: set_fact 15247 1726867272.04174: Evaluated conditional (not interface_stat.stat.exists): True 15247 1726867272.04179: handler run complete 15247 
1726867272.04189: attempt loop complete, returning result 15247 1726867272.04191: _execute() done 15247 1726867272.04194: dumping result to json 15247 1726867272.04196: done dumping result, returning 15247 1726867272.04203: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affcac9-a3a5-8ce3-1923-0000000004d7] 15247 1726867272.04213: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004d7 15247 1726867272.04288: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004d7 15247 1726867272.04291: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 15247 1726867272.04357: no more pending results, returning what we have 15247 1726867272.04361: results queue empty 15247 1726867272.04361: checking for any_errors_fatal 15247 1726867272.04370: done checking for any_errors_fatal 15247 1726867272.04371: checking for max_fail_percentage 15247 1726867272.04373: done checking for max_fail_percentage 15247 1726867272.04374: checking to see if all hosts have failed and the running result is not ok 15247 1726867272.04375: done checking to see if all hosts have failed 15247 1726867272.04376: getting the remaining hosts for this loop 15247 1726867272.04379: done getting the remaining hosts for this loop 15247 1726867272.04382: getting the next task for host managed_node2 15247 1726867272.04403: done getting next task for host managed_node2 15247 1726867272.04406: ^ task is: TASK: meta (flush_handlers) 15247 1726867272.04408: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867272.04411: getting variables 15247 1726867272.04413: in VariableManager get_vars() 15247 1726867272.04438: Calling all_inventory to load vars for managed_node2 15247 1726867272.04440: Calling groups_inventory to load vars for managed_node2 15247 1726867272.04443: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867272.04455: Calling all_plugins_play to load vars for managed_node2 15247 1726867272.04458: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867272.04461: Calling groups_plugins_play to load vars for managed_node2 15247 1726867272.05543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867272.07050: done with get_vars() 15247 1726867272.07081: done getting variables 15247 1726867272.07159: in VariableManager get_vars() 15247 1726867272.07171: Calling all_inventory to load vars for managed_node2 15247 1726867272.07173: Calling groups_inventory to load vars for managed_node2 15247 1726867272.07176: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867272.07186: Calling all_plugins_play to load vars for managed_node2 15247 1726867272.07190: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867272.07193: Calling groups_plugins_play to load vars for managed_node2 15247 1726867272.08401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867272.09423: done with get_vars() 15247 1726867272.09441: done queuing things up, now waiting for results queue to drain 15247 1726867272.09442: results queue empty 15247 1726867272.09443: checking for any_errors_fatal 15247 1726867272.09445: done checking for any_errors_fatal 15247 1726867272.09446: checking for max_fail_percentage 15247 1726867272.09447: done checking for max_fail_percentage 15247 1726867272.09447: checking to see if all hosts have failed and the running result is not 
ok 15247 1726867272.09448: done checking to see if all hosts have failed 15247 1726867272.09452: getting the remaining hosts for this loop 15247 1726867272.09453: done getting the remaining hosts for this loop 15247 1726867272.09454: getting the next task for host managed_node2 15247 1726867272.09457: done getting next task for host managed_node2 15247 1726867272.09458: ^ task is: TASK: meta (flush_handlers) 15247 1726867272.09459: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867272.09461: getting variables 15247 1726867272.09461: in VariableManager get_vars() 15247 1726867272.09467: Calling all_inventory to load vars for managed_node2 15247 1726867272.09468: Calling groups_inventory to load vars for managed_node2 15247 1726867272.09470: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867272.09474: Calling all_plugins_play to load vars for managed_node2 15247 1726867272.09476: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867272.09479: Calling groups_plugins_play to load vars for managed_node2 15247 1726867272.10333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867272.11508: done with get_vars() 15247 1726867272.11525: done getting variables 15247 1726867272.11573: in VariableManager get_vars() 15247 1726867272.11581: Calling all_inventory to load vars for managed_node2 15247 1726867272.11583: Calling groups_inventory to load vars for managed_node2 15247 1726867272.11584: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867272.11590: Calling all_plugins_play to load vars for managed_node2 15247 1726867272.11592: Calling groups_plugins_inventory to load vars for 
managed_node2 15247 1726867272.11595: Calling groups_plugins_play to load vars for managed_node2 15247 1726867272.12630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867272.14154: done with get_vars() 15247 1726867272.14183: done queuing things up, now waiting for results queue to drain 15247 1726867272.14185: results queue empty 15247 1726867272.14189: checking for any_errors_fatal 15247 1726867272.14190: done checking for any_errors_fatal 15247 1726867272.14191: checking for max_fail_percentage 15247 1726867272.14192: done checking for max_fail_percentage 15247 1726867272.14193: checking to see if all hosts have failed and the running result is not ok 15247 1726867272.14194: done checking to see if all hosts have failed 15247 1726867272.14195: getting the remaining hosts for this loop 15247 1726867272.14196: done getting the remaining hosts for this loop 15247 1726867272.14198: getting the next task for host managed_node2 15247 1726867272.14202: done getting next task for host managed_node2 15247 1726867272.14202: ^ task is: None 15247 1726867272.14204: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867272.14205: done queuing things up, now waiting for results queue to drain 15247 1726867272.14206: results queue empty 15247 1726867272.14207: checking for any_errors_fatal 15247 1726867272.14207: done checking for any_errors_fatal 15247 1726867272.14208: checking for max_fail_percentage 15247 1726867272.14209: done checking for max_fail_percentage 15247 1726867272.14209: checking to see if all hosts have failed and the running result is not ok 15247 1726867272.14210: done checking to see if all hosts have failed 15247 1726867272.14211: getting the next task for host managed_node2 15247 1726867272.14216: done getting next task for host managed_node2 15247 1726867272.14216: ^ task is: None 15247 1726867272.14219: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867272.14273: in VariableManager get_vars() 15247 1726867272.14288: done with get_vars() 15247 1726867272.14292: in VariableManager get_vars() 15247 1726867272.14297: done with get_vars() 15247 1726867272.14300: variable 'omit' from source: magic vars 15247 1726867272.14324: in VariableManager get_vars() 15247 1726867272.14330: done with get_vars() 15247 1726867272.14355: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 15247 1726867272.14546: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 15247 1726867272.14566: getting the remaining hosts for this loop 15247 1726867272.14567: done getting the remaining hosts for this loop 15247 1726867272.14568: getting the next task for host managed_node2 15247 1726867272.14570: done getting next task for host managed_node2 15247 1726867272.14572: ^ task is: TASK: Gathering Facts 15247 1726867272.14573: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867272.14575: getting variables 15247 1726867272.14575: in VariableManager get_vars() 15247 1726867272.14583: Calling all_inventory to load vars for managed_node2 15247 1726867272.14584: Calling groups_inventory to load vars for managed_node2 15247 1726867272.14585: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867272.14589: Calling all_plugins_play to load vars for managed_node2 15247 1726867272.14590: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867272.14592: Calling groups_plugins_play to load vars for managed_node2 15247 1726867272.15398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867272.16717: done with get_vars() 15247 1726867272.16739: done getting variables 15247 1726867272.16789: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Friday 20 September 2024 17:21:12 -0400 (0:00:00.148) 0:00:41.877 ****** 15247 1726867272.16816: entering _queue_task() for managed_node2/gather_facts 15247 1726867272.17144: worker is 1 (out of 1 available) 15247 1726867272.17155: exiting _queue_task() for managed_node2/gather_facts 15247 1726867272.17168: done queuing things up, now waiting for results queue to drain 15247 1726867272.17169: waiting for pending results... 
15247 1726867272.17525: running TaskExecutor() for managed_node2/TASK: Gathering Facts 15247 1726867272.17621: in run() - task 0affcac9-a3a5-8ce3-1923-0000000004fa 15247 1726867272.17648: variable 'ansible_search_path' from source: unknown 15247 1726867272.17674: calling self._execute() 15247 1726867272.17743: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867272.17749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867272.17757: variable 'omit' from source: magic vars 15247 1726867272.18180: variable 'ansible_distribution_major_version' from source: facts 15247 1726867272.18211: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867272.18216: variable 'omit' from source: magic vars 15247 1726867272.18249: variable 'omit' from source: magic vars 15247 1726867272.18359: variable 'omit' from source: magic vars 15247 1726867272.18437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867272.18631: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867272.18634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867272.18637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867272.18691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867272.18787: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867272.18801: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867272.18810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867272.19004: Set connection var ansible_shell_executable to /bin/sh 15247 1726867272.19037: Set 
connection var ansible_connection to ssh 15247 1726867272.19051: Set connection var ansible_shell_type to sh 15247 1726867272.19055: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867272.19057: Set connection var ansible_timeout to 10 15247 1726867272.19059: Set connection var ansible_pipelining to False 15247 1726867272.19108: variable 'ansible_shell_executable' from source: unknown 15247 1726867272.19112: variable 'ansible_connection' from source: unknown 15247 1726867272.19114: variable 'ansible_module_compression' from source: unknown 15247 1726867272.19117: variable 'ansible_shell_type' from source: unknown 15247 1726867272.19119: variable 'ansible_shell_executable' from source: unknown 15247 1726867272.19124: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867272.19129: variable 'ansible_pipelining' from source: unknown 15247 1726867272.19131: variable 'ansible_timeout' from source: unknown 15247 1726867272.19133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867272.19342: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867272.19351: variable 'omit' from source: magic vars 15247 1726867272.19356: starting attempt loop 15247 1726867272.19358: running the handler 15247 1726867272.19386: variable 'ansible_facts' from source: unknown 15247 1726867272.19398: _low_level_execute_command(): starting 15247 1726867272.19405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867272.20339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 
1726867272.20445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867272.20507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867272.20520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867272.20546: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867272.20723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867272.22310: stdout chunk (state=3): >>>/root <<< 15247 1726867272.22440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867272.22456: stderr chunk (state=3): >>><<< 15247 1726867272.22469: stdout chunk (state=3): >>><<< 15247 1726867272.22489: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867272.22499: _low_level_execute_command(): starting 15247 1726867272.22505: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482 `" && echo ansible-tmp-1726867272.2248929-17158-226674911767482="` echo /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482 `" ) && sleep 0' 15247 1726867272.22917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867272.22921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867272.22924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration <<< 15247 1726867272.22932: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867272.22934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867272.22990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867272.22994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867272.23033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867272.24954: stdout chunk (state=3): >>>ansible-tmp-1726867272.2248929-17158-226674911767482=/root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482 <<< 15247 1726867272.25157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867272.25160: stdout chunk (state=3): >>><<< 15247 1726867272.25163: stderr chunk (state=3): >>><<< 15247 1726867272.25189: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867272.2248929-17158-226674911767482=/root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867272.25392: variable 'ansible_module_compression' from source: unknown 15247 1726867272.25395: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 15247 1726867272.25397: variable 'ansible_facts' from source: unknown 15247 1726867272.25673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py 15247 1726867272.25857: Sending initial data 15247 1726867272.25868: Sent initial data (154 bytes) 15247 1726867272.26850: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867272.26904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867272.27056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867272.27060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867272.27174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867272.27216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867272.28856: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867272.28905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867272.28953: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpi50ie2g_ /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py <<< 15247 1726867272.28957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py" <<< 15247 1726867272.29009: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmpi50ie2g_" to remote "/root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py" <<< 15247 1726867272.30391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867272.30452: stderr chunk (state=3): >>><<< 15247 1726867272.30485: stdout chunk (state=3): >>><<< 15247 1726867272.30497: done transferring module to remote 15247 1726867272.30511: _low_level_execute_command(): starting 15247 1726867272.30520: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/ /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py && sleep 0' 15247 1726867272.31287: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867272.31302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867272.31318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867272.31336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867272.31453: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867272.31456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867272.31693: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867272.31864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867272.31909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867272.33831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867272.33834: stdout chunk (state=3): >>><<< 15247 1726867272.33837: stderr chunk (state=3): >>><<< 15247 1726867272.33976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867272.33983: _low_level_execute_command(): starting 15247 1726867272.33985: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/AnsiballZ_setup.py && sleep 0' 15247 1726867272.35044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867272.35047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867272.35059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867272.35075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867272.35192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867272.35397: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867272.35410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867272.35489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867272.99779: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 
12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "12", "epoch": "1726867272", "epoch_int": "1726867272", "date": "2024-09-20", "time": "17:21:12", "iso8601_micro": "2024-09-20T21:21:12.628922Z", "iso8601": "2024-09-20T21:21:12Z", "iso8601_basic": "20240920T172112628922", "iso8601_basic_short": "20240920T172112", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", 
"LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enfor<<< 15247 1726867272.99806: stdout chunk (state=3): >>>ced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.6494140625, "5m": 0.416015625, "15m": 0.205078125}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": 
"10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 510, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794816000, "block_size": 4096, "block_total": 65519099, "block_available": 63914750, "block_used": 1604349, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 15247 1726867273.01830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867273.01855: stderr chunk (state=3): >>><<< 15247 1726867273.01859: stdout chunk (state=3): >>><<< 15247 1726867273.01891: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCtb6d2lCF5j73bQuklHmiwgIkrtsKyvhhqKmw8PAzxlwefW/eKAWRL8t5o4OpguzwN7Xas0KL/3n2vcGn89eWwkJ64NRFeevRauyAYYJ7nZE1QwJ9dGZo3zaVp/oC1MhHRdrICrLbAyfRiW6jeK2b1Ft+mdFsl86F6MUriI4qIiDp6FW5cChFc/iqHkG0NrVcTWrza0Es3nS/Vb6349spn+wBwFoB8hnx62+ER/+0mlRFjmDZxUqXJrfrBV2J72kQoHDeAgVQq1eQR36osPevJGc/pnNwDx/wf8WRSXPpRn9YbagddfgHFZCnbCFTk9YyiwyK1U/LZ9lIBSiUj976pi440fQTwjmVQAe/qHANcHOSrfP9jYcRbhARGCv4wQ3AgjnHnPA133ZmzX10g6C8WgEQGYuuOF4B+PCLLUedGyOw1cG8VBPS5TS6vnCTX5Gzk7SSdeLXP4fKdYMINUli7zoTEoxkmQiMJGTp4ydGu57F+aQ9yu3smHITyKHD7K10=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAv/YAwk9YsHU5O92V8EtqxoQj/LDcsKo4HtikGRATr3RtxreTXeA3ChH0hadXwgOPbV3d9lKKbe/T8Kk3p3CL8=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIHytSiqr0SQwJilSJH3MYhh2kiNKutXw14J1DPufbDt8", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "21", "second": "12", "epoch": "1726867272", "epoch_int": "1726867272", "date": "2024-09-20", "time": "17:21:12", "iso8601_micro": "2024-09-20T21:21:12.628922Z", "iso8601": "2024-09-20T21:21:12Z", "iso8601_basic": "20240920T172112628922", "iso8601_basic_short": "20240920T172112", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-116.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-12-116", "ansible_nodename": "ip-10-31-12-116.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273454a5a8b2a199265679d6a78897", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 54464 10.31.12.116 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 54464 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": 
{"status": "disabled"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.6494140625, "5m": 0.416015625, "15m": 0.205078125}, "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:d5ff:fec3:77ad", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off 
[fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.116", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:d5:c3:77:ad", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.12.116"], "ansible_all_ipv6_addresses": ["fe80::8ff:d5ff:fec3:77ad"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.116", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:d5ff:fec3:77ad"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3292, "used": 239}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_uuid": "ec273454-a5a8-b2a1-9926-5679d6a78897", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": 
[], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 510, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261794816000, "block_size": 4096, "block_total": 65519099, "block_available": 63914750, "block_used": 1604349, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867273.02207: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867273.02226: _low_level_execute_command(): starting 15247 1726867273.02229: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867272.2248929-17158-226674911767482/ > /dev/null 2>&1 && sleep 0' 15247 1726867273.02659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867273.02663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.02665: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.02667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867273.02669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.02749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.02798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.04790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.04793: stdout chunk (state=3): >>><<< 15247 1726867273.04797: stderr chunk (state=3): >>><<< 15247 1726867273.04802: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.04804: handler run complete 15247 1726867273.04943: variable 'ansible_facts' from source: unknown 15247 1726867273.05150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.05598: variable 'ansible_facts' from source: unknown 15247 1726867273.05650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.05769: attempt loop complete, returning result 15247 1726867273.05773: _execute() done 15247 1726867273.05780: dumping result to json 15247 1726867273.05829: done dumping result, returning 15247 1726867273.05832: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [0affcac9-a3a5-8ce3-1923-0000000004fa] 15247 1726867273.05834: sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004fa ok: [managed_node2] 15247 1726867273.06548: no more pending results, returning what we have 15247 1726867273.06551: results queue empty 15247 1726867273.06551: checking for any_errors_fatal 15247 1726867273.06552: done checking for any_errors_fatal 15247 1726867273.06553: checking for max_fail_percentage 15247 1726867273.06554: done checking for max_fail_percentage 15247 1726867273.06555: checking to see if all hosts have failed and the running result is not ok 15247 1726867273.06557: done checking to see if all hosts have failed 15247 1726867273.06558: getting the remaining hosts for this loop 15247 1726867273.06559: done getting the remaining hosts for this loop 15247 
1726867273.06566: getting the next task for host managed_node2 15247 1726867273.06573: done getting next task for host managed_node2 15247 1726867273.06576: ^ task is: TASK: meta (flush_handlers) 15247 1726867273.06579: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 15247 1726867273.06583: getting variables 15247 1726867273.06584: in VariableManager get_vars() 15247 1726867273.06610: Calling all_inventory to load vars for managed_node2 15247 1726867273.06613: Calling groups_inventory to load vars for managed_node2 15247 1726867273.06616: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867273.06627: Calling all_plugins_play to load vars for managed_node2 15247 1726867273.06630: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867273.06634: Calling groups_plugins_play to load vars for managed_node2 15247 1726867273.07291: done sending task result for task 0affcac9-a3a5-8ce3-1923-0000000004fa 15247 1726867273.07294: WORKER PROCESS EXITING 15247 1726867273.07561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.08729: done with get_vars() 15247 1726867273.08744: done getting variables 15247 1726867273.08795: in VariableManager get_vars() 15247 1726867273.08802: Calling all_inventory to load vars for managed_node2 15247 1726867273.08803: Calling groups_inventory to load vars for managed_node2 15247 1726867273.08805: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867273.08808: Calling all_plugins_play to load vars for managed_node2 15247 1726867273.08809: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867273.08811: Calling groups_plugins_play to load 
vars for managed_node2 15247 1726867273.09742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.11166: done with get_vars() 15247 1726867273.11186: done queuing things up, now waiting for results queue to drain 15247 1726867273.11187: results queue empty 15247 1726867273.11188: checking for any_errors_fatal 15247 1726867273.11190: done checking for any_errors_fatal 15247 1726867273.11190: checking for max_fail_percentage 15247 1726867273.11191: done checking for max_fail_percentage 15247 1726867273.11195: checking to see if all hosts have failed and the running result is not ok 15247 1726867273.11196: done checking to see if all hosts have failed 15247 1726867273.11196: getting the remaining hosts for this loop 15247 1726867273.11197: done getting the remaining hosts for this loop 15247 1726867273.11198: getting the next task for host managed_node2 15247 1726867273.11201: done getting next task for host managed_node2 15247 1726867273.11202: ^ task is: TASK: Verify network state restored to default 15247 1726867273.11203: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867273.11204: getting variables 15247 1726867273.11205: in VariableManager get_vars() 15247 1726867273.11210: Calling all_inventory to load vars for managed_node2 15247 1726867273.11212: Calling groups_inventory to load vars for managed_node2 15247 1726867273.11213: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867273.11217: Calling all_plugins_play to load vars for managed_node2 15247 1726867273.11220: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867273.11223: Calling groups_plugins_play to load vars for managed_node2 15247 1726867273.12198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.13044: done with get_vars() 15247 1726867273.13059: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Friday 20 September 2024 17:21:13 -0400 (0:00:00.962) 0:00:42.840 ****** 15247 1726867273.13110: entering _queue_task() for managed_node2/include_tasks 15247 1726867273.13335: worker is 1 (out of 1 available) 15247 1726867273.13349: exiting _queue_task() for managed_node2/include_tasks 15247 1726867273.13360: done queuing things up, now waiting for results queue to drain 15247 1726867273.13362: waiting for pending results... 
15247 1726867273.13538: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 15247 1726867273.13610: in run() - task 0affcac9-a3a5-8ce3-1923-00000000007a 15247 1726867273.13624: variable 'ansible_search_path' from source: unknown 15247 1726867273.13653: calling self._execute() 15247 1726867273.13721: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.13727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.13736: variable 'omit' from source: magic vars 15247 1726867273.14082: variable 'ansible_distribution_major_version' from source: facts 15247 1726867273.14085: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867273.14089: _execute() done 15247 1726867273.14092: dumping result to json 15247 1726867273.14095: done dumping result, returning 15247 1726867273.14096: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [0affcac9-a3a5-8ce3-1923-00000000007a] 15247 1726867273.14111: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000007a 15247 1726867273.14209: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000007a 15247 1726867273.14211: WORKER PROCESS EXITING 15247 1726867273.14265: no more pending results, returning what we have 15247 1726867273.14270: in VariableManager get_vars() 15247 1726867273.14304: Calling all_inventory to load vars for managed_node2 15247 1726867273.14306: Calling groups_inventory to load vars for managed_node2 15247 1726867273.14309: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867273.14321: Calling all_plugins_play to load vars for managed_node2 15247 1726867273.14323: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867273.14326: Calling groups_plugins_play to load vars for managed_node2 15247 1726867273.15156: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.16168: done with get_vars() 15247 1726867273.16183: variable 'ansible_search_path' from source: unknown 15247 1726867273.16193: we have included files to process 15247 1726867273.16193: generating all_blocks data 15247 1726867273.16194: done generating all_blocks data 15247 1726867273.16195: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15247 1726867273.16195: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15247 1726867273.16197: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 15247 1726867273.16449: done processing included file 15247 1726867273.16450: iterating over new_blocks loaded from include file 15247 1726867273.16452: in VariableManager get_vars() 15247 1726867273.16460: done with get_vars() 15247 1726867273.16461: filtering new block on tags 15247 1726867273.16471: done filtering new block on tags 15247 1726867273.16472: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 15247 1726867273.16475: extending task lists for all hosts with included blocks 15247 1726867273.16497: done extending task lists 15247 1726867273.16498: done processing included files 15247 1726867273.16498: results queue empty 15247 1726867273.16499: checking for any_errors_fatal 15247 1726867273.16499: done checking for any_errors_fatal 15247 1726867273.16500: checking for max_fail_percentage 15247 1726867273.16500: done checking for max_fail_percentage 15247 1726867273.16501: checking to see if all hosts have failed and the running 
result is not ok 15247 1726867273.16501: done checking to see if all hosts have failed 15247 1726867273.16502: getting the remaining hosts for this loop 15247 1726867273.16503: done getting the remaining hosts for this loop 15247 1726867273.16504: getting the next task for host managed_node2 15247 1726867273.16506: done getting next task for host managed_node2 15247 1726867273.16508: ^ task is: TASK: Check routes and DNS 15247 1726867273.16509: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867273.16510: getting variables 15247 1726867273.16511: in VariableManager get_vars() 15247 1726867273.16518: Calling all_inventory to load vars for managed_node2 15247 1726867273.16520: Calling groups_inventory to load vars for managed_node2 15247 1726867273.16521: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867273.16524: Calling all_plugins_play to load vars for managed_node2 15247 1726867273.16526: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867273.16527: Calling groups_plugins_play to load vars for managed_node2 15247 1726867273.17214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.18398: done with get_vars() 15247 1726867273.18418: done getting variables 15247 1726867273.18468: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:21:13 -0400 (0:00:00.053) 0:00:42.894 ****** 15247 1726867273.18503: entering _queue_task() for managed_node2/shell 15247 1726867273.18800: worker is 1 (out of 1 available) 15247 1726867273.18812: exiting _queue_task() for managed_node2/shell 15247 1726867273.18823: done queuing things up, now waiting for results queue to drain 15247 1726867273.18825: waiting for pending results... 
15247 1726867273.19151: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 15247 1726867273.19207: in run() - task 0affcac9-a3a5-8ce3-1923-00000000050b 15247 1726867273.19218: variable 'ansible_search_path' from source: unknown 15247 1726867273.19221: variable 'ansible_search_path' from source: unknown 15247 1726867273.19417: calling self._execute() 15247 1726867273.19421: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.19429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.19432: variable 'omit' from source: magic vars 15247 1726867273.19921: variable 'ansible_distribution_major_version' from source: facts 15247 1726867273.19925: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867273.19941: variable 'omit' from source: magic vars 15247 1726867273.19944: variable 'omit' from source: magic vars 15247 1726867273.19946: variable 'omit' from source: magic vars 15247 1726867273.19948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867273.20009: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867273.20013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867273.20038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867273.20041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867273.20066: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867273.20078: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.20081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.20153: 
Set connection var ansible_shell_executable to /bin/sh 15247 1726867273.20208: Set connection var ansible_connection to ssh 15247 1726867273.20212: Set connection var ansible_shell_type to sh 15247 1726867273.20217: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867273.20219: Set connection var ansible_timeout to 10 15247 1726867273.20221: Set connection var ansible_pipelining to False 15247 1726867273.20224: variable 'ansible_shell_executable' from source: unknown 15247 1726867273.20226: variable 'ansible_connection' from source: unknown 15247 1726867273.20229: variable 'ansible_module_compression' from source: unknown 15247 1726867273.20231: variable 'ansible_shell_type' from source: unknown 15247 1726867273.20233: variable 'ansible_shell_executable' from source: unknown 15247 1726867273.20235: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.20237: variable 'ansible_pipelining' from source: unknown 15247 1726867273.20267: variable 'ansible_timeout' from source: unknown 15247 1726867273.20270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.20364: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867273.20466: variable 'omit' from source: magic vars 15247 1726867273.20469: starting attempt loop 15247 1726867273.20472: running the handler 15247 1726867273.20475: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867273.20480: 
_low_level_execute_command(): starting 15247 1726867273.20482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867273.21036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867273.21040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.21045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.21047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.21103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.21109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867273.21111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.21156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.22860: stdout chunk (state=3): >>>/root <<< 15247 1726867273.23030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.23129: stderr chunk (state=3): >>><<< 15247 1726867273.23136: stdout chunk (state=3): >>><<< 15247 1726867273.23285: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.23288: _low_level_execute_command(): starting 15247 1726867273.23291: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901 `" && echo ansible-tmp-1726867273.231869-17196-94617433404901="` echo /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901 `" ) && sleep 0' 15247 1726867273.23883: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867273.23902: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.23960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.24085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.26283: stdout chunk (state=3): >>>ansible-tmp-1726867273.231869-17196-94617433404901=/root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901 <<< 15247 1726867273.26322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.26325: stdout chunk (state=3): >>><<< 15247 1726867273.26327: stderr chunk (state=3): >>><<< 15247 1726867273.26334: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867273.231869-17196-94617433404901=/root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.26411: variable 'ansible_module_compression' from source: unknown 15247 1726867273.26540: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15247 1726867273.26626: variable 'ansible_facts' from source: unknown 15247 1726867273.26804: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py 15247 1726867273.27105: Sending initial data 15247 1726867273.27108: Sent initial data (154 bytes) 15247 1726867273.27619: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867273.27638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.27649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.27694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.27712: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.27752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.29812: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867273.29850: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 15247 1726867273.29901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp_zhxohye /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py <<< 15247 1726867273.29905: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py" <<< 15247 1726867273.29936: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp_zhxohye" to remote "/root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py" <<< 15247 1726867273.30859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.30983: stderr chunk (state=3): >>><<< 15247 1726867273.30986: stdout chunk (state=3): >>><<< 15247 1726867273.30988: done transferring module to remote 15247 1726867273.30990: _low_level_execute_command(): starting 15247 1726867273.30992: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/ /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py && sleep 0' 15247 1726867273.32233: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.32245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.32296: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.32317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.32330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.32425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.34208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.34233: stderr chunk (state=3): >>><<< 15247 1726867273.34242: stdout chunk (state=3): >>><<< 15247 1726867273.34270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.34452: _low_level_execute_command(): starting 15247 1726867273.34455: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/AnsiballZ_command.py && sleep 0' 15247 1726867273.35504: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.35667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.35693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.35801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.52510: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3373sec preferred_lft 3373sec\n inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:21:13.514823", "end": "2024-09-20 17:21:13.523697", "delta": "0:00:00.008874", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15247 1726867273.54103: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.12.116 closed. <<< 15247 1726867273.54107: stdout chunk (state=3): >>><<< 15247 1726867273.54110: stderr chunk (state=3): >>><<< 15247 1726867273.54130: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3373sec preferred_lft 3373sec\n inet6 fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:21:13.514823", "end": "2024-09-20 17:21:13.523697", "delta": "0:00:00.008874", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 
15247 1726867273.54409: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867273.54412: _low_level_execute_command(): starting 15247 1726867273.54415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867273.231869-17196-94617433404901/ > /dev/null 2>&1 && sleep 0' 15247 1726867273.55529: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.55533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found <<< 15247 1726867273.55535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 15247 1726867273.55537: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.116 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867273.55539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found <<< 15247 1726867273.55541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.55692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867273.55793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.55853: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.57707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.57737: stderr chunk (state=3): >>><<< 15247 1726867273.57747: stdout chunk (state=3): >>><<< 15247 1726867273.57817: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 
10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.57849: handler run complete 15247 1726867273.57983: Evaluated conditional (False): False 15247 1726867273.57986: attempt loop complete, returning result 15247 1726867273.57988: _execute() done 15247 1726867273.57990: dumping result to json 15247 1726867273.57992: done dumping result, returning 15247 1726867273.57994: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [0affcac9-a3a5-8ce3-1923-00000000050b] 15247 1726867273.58094: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000050b 15247 1726867273.58557: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000050b 15247 1726867273.58561: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008874", "end": "2024-09-20 17:21:13.523697", "rc": 0, "start": "2024-09-20 17:21:13.514823" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:d5:c3:77:ad brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.12.116/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3373sec preferred_lft 3373sec inet6 
fe80::8ff:d5ff:fec3:77ad/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.116 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.116 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 15247 1726867273.58658: no more pending results, returning what we have 15247 1726867273.58662: results queue empty 15247 1726867273.58663: checking for any_errors_fatal 15247 1726867273.58665: done checking for any_errors_fatal 15247 1726867273.58666: checking for max_fail_percentage 15247 1726867273.58667: done checking for max_fail_percentage 15247 1726867273.58668: checking to see if all hosts have failed and the running result is not ok 15247 1726867273.58670: done checking to see if all hosts have failed 15247 1726867273.58670: getting the remaining hosts for this loop 15247 1726867273.58672: done getting the remaining hosts for this loop 15247 1726867273.58676: getting the next task for host managed_node2 15247 1726867273.58685: done getting next task for host managed_node2 15247 1726867273.58688: ^ task is: TASK: Verify DNS and network connectivity 15247 1726867273.58691: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867273.58695: getting variables 15247 1726867273.58697: in VariableManager get_vars() 15247 1726867273.58733: Calling all_inventory to load vars for managed_node2 15247 1726867273.58736: Calling groups_inventory to load vars for managed_node2 15247 1726867273.58740: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867273.58753: Calling all_plugins_play to load vars for managed_node2 15247 1726867273.58756: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867273.58759: Calling groups_plugins_play to load vars for managed_node2 15247 1726867273.60676: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867273.62695: done with get_vars() 15247 1726867273.62718: done getting variables 15247 1726867273.63005: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:21:13 -0400 (0:00:00.445) 0:00:43.340 ****** 15247 1726867273.63064: entering _queue_task() for managed_node2/shell 15247 1726867273.63356: worker is 1 (out of 1 available) 15247 1726867273.63368: exiting _queue_task() for managed_node2/shell 15247 1726867273.63383: done queuing things up, now waiting for results queue to drain 15247 1726867273.63385: waiting for pending results... 
15247 1726867273.63797: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 15247 1726867273.63802: in run() - task 0affcac9-a3a5-8ce3-1923-00000000050c 15247 1726867273.63805: variable 'ansible_search_path' from source: unknown 15247 1726867273.63808: variable 'ansible_search_path' from source: unknown 15247 1726867273.63882: calling self._execute() 15247 1726867273.63996: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.64013: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.64030: variable 'omit' from source: magic vars 15247 1726867273.64517: variable 'ansible_distribution_major_version' from source: facts 15247 1726867273.64567: Evaluated conditional (ansible_distribution_major_version != '6'): True 15247 1726867273.64764: variable 'ansible_facts' from source: unknown 15247 1726867273.66049: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 15247 1726867273.66062: variable 'omit' from source: magic vars 15247 1726867273.66121: variable 'omit' from source: magic vars 15247 1726867273.66159: variable 'omit' from source: magic vars 15247 1726867273.66206: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 15247 1726867273.66251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 15247 1726867273.66331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 15247 1726867273.66335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867273.66337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 15247 1726867273.66349: variable 'inventory_hostname' from source: host vars for 'managed_node2' 15247 1726867273.66356: variable 
'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.66364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.66474: Set connection var ansible_shell_executable to /bin/sh 15247 1726867273.66486: Set connection var ansible_connection to ssh 15247 1726867273.66493: Set connection var ansible_shell_type to sh 15247 1726867273.66503: Set connection var ansible_module_compression to ZIP_DEFLATED 15247 1726867273.66515: Set connection var ansible_timeout to 10 15247 1726867273.66524: Set connection var ansible_pipelining to False 15247 1726867273.66581: variable 'ansible_shell_executable' from source: unknown 15247 1726867273.66584: variable 'ansible_connection' from source: unknown 15247 1726867273.66587: variable 'ansible_module_compression' from source: unknown 15247 1726867273.66589: variable 'ansible_shell_type' from source: unknown 15247 1726867273.66591: variable 'ansible_shell_executable' from source: unknown 15247 1726867273.66592: variable 'ansible_host' from source: host vars for 'managed_node2' 15247 1726867273.66594: variable 'ansible_pipelining' from source: unknown 15247 1726867273.66596: variable 'ansible_timeout' from source: unknown 15247 1726867273.66598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 15247 1726867273.66741: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867273.66763: variable 'omit' from source: magic vars 15247 1726867273.66872: starting attempt loop 15247 1726867273.66875: running the handler 15247 1726867273.66880: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 15247 1726867273.66882: _low_level_execute_command(): starting 15247 1726867273.66884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 15247 1726867273.67537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.67614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.67646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867273.67670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.67873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.69534: stdout chunk (state=3): >>>/root <<< 15247 1726867273.69661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.69674: stdout chunk (state=3): >>><<< 15247 1726867273.69702: stderr chunk 
(state=3): >>><<< 15247 1726867273.69723: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.69741: _low_level_execute_command(): starting 15247 1726867273.69751: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546 `" && echo ansible-tmp-1726867273.6972966-17218-222960441499546="` echo /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546 `" ) && sleep 0' 15247 1726867273.70318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867273.70336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.70359: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 15247 1726867273.70381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867273.70400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 <<< 15247 1726867273.70412: stderr chunk (state=3): >>>debug2: match not found <<< 15247 1726867273.70426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.70494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.70532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.70555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867273.70614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.70631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.72517: stdout chunk (state=3): >>>ansible-tmp-1726867273.6972966-17218-222960441499546=/root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546 <<< 15247 1726867273.72680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.72683: stdout chunk (state=3): >>><<< 15247 1726867273.72686: stderr chunk (state=3): >>><<< 15247 1726867273.72700: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867273.6972966-17218-222960441499546=/root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.72887: variable 'ansible_module_compression' from source: unknown 15247 1726867273.72890: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-15247p_b7opb1/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 15247 1726867273.72892: variable 'ansible_facts' from source: unknown 15247 1726867273.72910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py 15247 1726867273.73126: Sending initial data 15247 1726867273.73128: Sent initial data (156 bytes) 15247 1726867273.74100: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.74161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' <<< 15247 1726867273.74179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867273.74232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.74386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.75969: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 15247 1726867273.75997: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" 
revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 15247 1726867273.76050: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 15247 1726867273.76125: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp0fyke9c9 /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py <<< 15247 1726867273.76140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py" <<< 15247 1726867273.76158: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 15247 1726867273.76192: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-15247p_b7opb1/tmp0fyke9c9" to remote "/root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py" <<< 15247 1726867273.77058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.77061: stdout chunk (state=3): >>><<< 15247 1726867273.77063: stderr chunk (state=3): >>><<< 15247 1726867273.77081: done transferring module to remote 15247 1726867273.77096: _low_level_execute_command(): starting 15247 1726867273.77105: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/ /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py && sleep 0' 15247 1726867273.77908: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867273.77920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.77938: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 15247 1726867273.77961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867273.78098: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867273.78251: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867273.78259: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.78314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867273.80089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867273.80132: stderr chunk (state=3): >>><<< 15247 1726867273.80135: stdout chunk (state=3): >>><<< 15247 1726867273.80151: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867273.80226: _low_level_execute_command(): starting 15247 1726867273.80229: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/AnsiballZ_command.py && sleep 0' 15247 1726867273.80724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 15247 1726867273.80743: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 15247 1726867273.80768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 15247 1726867273.80788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 15247 1726867273.80930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867273.80986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867274.30164: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3910 0 --:--:-- 
--:--:-- --:--:-- 3961\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1364 0 --:--:-- --:--:-- --:--:-- 1359", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:21:13.964079", "end": "2024-09-20 17:21:14.298835", "delta": "0:00:00.334756", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 15247 1726867274.31688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867274.31746: stderr chunk (state=3): >>>Shared connection to 10.31.12.116 closed. 
<<< 15247 1726867274.31754: stdout chunk (state=3): >>><<< 15247 1726867274.31762: stderr chunk (state=3): >>><<< 15247 1726867274.31788: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 3910 0 --:--:-- --:--:-- --:--:-- 3961\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1364 0 --:--:-- --:--:-- --:--:-- 1359", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor 
host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:21:13.964079", "end": "2024-09-20 17:21:14.298835", "delta": "0:00:00.334756", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.116 closed. 15247 1726867274.31936: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 15247 1726867274.31940: _low_level_execute_command(): starting 15247 1726867274.31942: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867273.6972966-17218-222960441499546/ > /dev/null 2>&1 && sleep 0' 15247 1726867274.33194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 15247 1726867274.33344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK <<< 15247 1726867274.33388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 15247 1726867274.33440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 15247 1726867274.35348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 15247 1726867274.35357: stdout chunk (state=3): >>><<< 15247 1726867274.35370: stderr chunk (state=3): >>><<< 15247 1726867274.35404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.116 originally 10.31.12.116 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.116 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.12.116 originally 10.31.12.116 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/1ce91f36e8' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 15247 1726867274.35418: handler run complete 15247 1726867274.35444: Evaluated conditional (False): False 15247 1726867274.35460: attempt loop complete, returning result 15247 1726867274.35468: _execute() done 15247 1726867274.35474: dumping result to json 15247 1726867274.35495: done dumping result, returning 15247 1726867274.35507: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [0affcac9-a3a5-8ce3-1923-00000000050c] 15247 1726867274.35520: sending task result for task 0affcac9-a3a5-8ce3-1923-00000000050c 15247 1726867274.35839: done sending task result for task 0affcac9-a3a5-8ce3-1923-00000000050c 15247 1726867274.35843: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.334756", "end": "2024-09-20 17:21:14.298835", "rc": 0, "start": "2024-09-20 17:21:13.964079" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 3910 0 --:--:-- --:--:-- --:--:-- 3961 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1364 0 --:--:-- --:--:-- --:--:-- 1359 15247 1726867274.35916: no more pending results, returning what we have 15247 1726867274.35919: 
results queue empty 15247 1726867274.35920: checking for any_errors_fatal 15247 1726867274.35929: done checking for any_errors_fatal 15247 1726867274.35930: checking for max_fail_percentage 15247 1726867274.35932: done checking for max_fail_percentage 15247 1726867274.35933: checking to see if all hosts have failed and the running result is not ok 15247 1726867274.35934: done checking to see if all hosts have failed 15247 1726867274.35935: getting the remaining hosts for this loop 15247 1726867274.35936: done getting the remaining hosts for this loop 15247 1726867274.35940: getting the next task for host managed_node2 15247 1726867274.35948: done getting next task for host managed_node2 15247 1726867274.35957: ^ task is: TASK: meta (flush_handlers) 15247 1726867274.35959: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 15247 1726867274.35964: getting variables 15247 1726867274.35966: in VariableManager get_vars() 15247 1726867274.36064: Calling all_inventory to load vars for managed_node2 15247 1726867274.36067: Calling groups_inventory to load vars for managed_node2 15247 1726867274.36071: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867274.36309: Calling all_plugins_play to load vars for managed_node2 15247 1726867274.36314: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867274.36318: Calling groups_plugins_play to load vars for managed_node2 15247 1726867274.38416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867274.40202: done with get_vars() 15247 1726867274.40223: done getting variables 15247 1726867274.40294: in VariableManager get_vars() 15247 1726867274.40303: Calling all_inventory to load vars for managed_node2 15247 1726867274.40305: Calling groups_inventory to load vars for managed_node2 15247 1726867274.40307: Calling all_plugins_inventory to load vars for managed_node2 15247 1726867274.40312: Calling all_plugins_play to load vars for managed_node2 15247 1726867274.40314: Calling groups_plugins_inventory to load vars for managed_node2 15247 1726867274.40317: Calling groups_plugins_play to load vars for managed_node2 15247 1726867274.42692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 15247 1726867274.44729: done with get_vars() 15247 1726867274.44755: done queuing things up, now waiting for results queue to drain 15247 1726867274.44757: results queue empty 15247 1726867274.44758: checking for any_errors_fatal 15247 1726867274.44766: done checking for any_errors_fatal 15247 1726867274.44767: checking for max_fail_percentage 15247 1726867274.44768: done checking for max_fail_percentage 15247 1726867274.44769: checking to see if all hosts have failed and the running result is not 
ok
15247 1726867274.44770: done checking to see if all hosts have failed
15247 1726867274.44771: getting the remaining hosts for this loop
15247 1726867274.44771: done getting the remaining hosts for this loop
15247 1726867274.44774: getting the next task for host managed_node2
15247 1726867274.44780: done getting next task for host managed_node2
15247 1726867274.44802: ^ task is: TASK: meta (flush_handlers)
15247 1726867274.44804: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867274.44808: getting variables
15247 1726867274.44809: in VariableManager get_vars()
15247 1726867274.44818: Calling all_inventory to load vars for managed_node2
15247 1726867274.44822: Calling groups_inventory to load vars for managed_node2
15247 1726867274.44828: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867274.44834: Calling all_plugins_play to load vars for managed_node2
15247 1726867274.44836: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867274.44857: Calling groups_plugins_play to load vars for managed_node2
15247 1726867274.46092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867274.48606: done with get_vars()
15247 1726867274.48626: done getting variables
15247 1726867274.48680: in VariableManager get_vars()
15247 1726867274.48689: Calling all_inventory to load vars for managed_node2
15247 1726867274.48691: Calling groups_inventory to load vars for managed_node2
15247 1726867274.48694: Calling all_plugins_inventory to load vars for managed_node2
15247 1726867274.48698: Calling all_plugins_play to load vars for managed_node2
15247 1726867274.48714: Calling groups_plugins_inventory to load vars for managed_node2
15247 1726867274.48718: Calling groups_plugins_play to load vars for managed_node2
15247 1726867274.49911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
15247 1726867274.53618: done with get_vars()
15247 1726867274.53645: done queuing things up, now waiting for results queue to drain
15247 1726867274.53647: results queue empty
15247 1726867274.53648: checking for any_errors_fatal
15247 1726867274.53649: done checking for any_errors_fatal
15247 1726867274.53650: checking for max_fail_percentage
15247 1726867274.53651: done checking for max_fail_percentage
15247 1726867274.53652: checking to see if all hosts have failed and the running result is not ok
15247 1726867274.53653: done checking to see if all hosts have failed
15247 1726867274.53653: getting the remaining hosts for this loop
15247 1726867274.53654: done getting the remaining hosts for this loop
15247 1726867274.53657: getting the next task for host managed_node2
15247 1726867274.53661: done getting next task for host managed_node2
15247 1726867274.53662: ^ task is: None
15247 1726867274.53664: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
15247 1726867274.53665: done queuing things up, now waiting for results queue to drain
15247 1726867274.53666: results queue empty
15247 1726867274.53666: checking for any_errors_fatal
15247 1726867274.53667: done checking for any_errors_fatal
15247 1726867274.53668: checking for max_fail_percentage
15247 1726867274.53669: done checking for max_fail_percentage
15247 1726867274.53670: checking to see if all hosts have failed and the running result is not ok
15247 1726867274.53671: done checking to see if all hosts have failed
15247 1726867274.53672: getting the next task for host managed_node2
15247 1726867274.53674: done getting next task for host managed_node2
15247 1726867274.53675: ^ task is: None
15247 1726867274.53676: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2              : ok=82   changed=3    unreachable=0    failed=0    skipped=71   rescued=0    ignored=2

Friday 20 September 2024  17:21:14 -0400 (0:00:00.907)       0:00:44.247 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.16s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.10s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.97s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.90s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.19s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 1.12s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.10s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Check if system is ostree ----------------------------------------------- 0.98s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.96s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64
Gathering Facts --------------------------------------------------------- 0.94s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Verify DNS and network connectivity ------------------------------------- 0.91s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.90s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3
Gather the minimum subset of ansible_facts required by the network role test --- 0.85s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
15247 1726867274.54090: RUNNING CLEANUP